Nov 26 22:38:43 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 26 22:38:43 crc restorecon[4691]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 26 22:38:43 crc restorecon[4691]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc 
restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc 
restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 
22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 22:38:43 crc 
restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 22:38:43 crc 
restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 22:38:43
crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 
22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 22:38:43 crc 
restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc 
restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 
26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 
crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc 
restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc 
restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc 
restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc 
restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc 
restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 22:38:44 crc restorecon[4691]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 22:38:44 crc restorecon[4691]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 26 22:38:45 crc kubenswrapper[5008]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 26 22:38:45 crc kubenswrapper[5008]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 26 22:38:45 crc kubenswrapper[5008]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 26 22:38:45 crc kubenswrapper[5008]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 26 22:38:45 crc kubenswrapper[5008]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 26 22:38:45 crc kubenswrapper[5008]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.215617 5008 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225393 5008 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225424 5008 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225436 5008 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225445 5008 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225456 5008 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225465 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225474 5008 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225483 5008 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225491 5008 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225499 5008 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225507 5008 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225515 5008 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225522 5008 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225530 5008 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225538 5008 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225545 5008 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 
22:38:45.225553 5008 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225561 5008 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225568 5008 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225575 5008 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225586 5008 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225608 5008 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225619 5008 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225629 5008 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225637 5008 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225646 5008 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225656 5008 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225666 5008 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225676 5008 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225684 5008 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225692 5008 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225700 5008 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225708 5008 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225716 5008 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225724 5008 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225732 5008 feature_gate.go:330] unrecognized feature gate: Example
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225740 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225748 5008 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225756 5008 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225764 5008 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225772 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225780 5008 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225788 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225795 5008 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225803 5008 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225810 5008 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225819 5008 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225826 5008 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225834 5008 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225841 5008 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225849 5008 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225857 5008 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225864 5008 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225872 5008 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225879 5008 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225887 5008 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225894 5008 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225902 5008 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225910 5008 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225917 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225925 5008 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225933 5008 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225940 5008 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225947 5008 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225954 5008 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225985 5008 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.225995 5008 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.226003 5008 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.226010 5008 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.226020 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.226028 5008 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229259 5008 flags.go:64] FLAG: --address="0.0.0.0"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229287 5008 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229304 5008 flags.go:64] FLAG: --anonymous-auth="true"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229316 5008 flags.go:64] FLAG: --application-metrics-count-limit="100"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229327 5008 flags.go:64] FLAG: --authentication-token-webhook="false"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229337 5008 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229349 5008 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229361 5008 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229370 5008 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229379 5008 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229388 5008 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229398 5008 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229407 5008 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229416 5008 flags.go:64] FLAG: --cgroup-root=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229424 5008 flags.go:64] FLAG: --cgroups-per-qos="true"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229433 5008 flags.go:64] FLAG: --client-ca-file=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229442 5008 flags.go:64] FLAG: --cloud-config=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229451 5008 flags.go:64] FLAG: --cloud-provider=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229459 5008 flags.go:64] FLAG: --cluster-dns="[]"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229469 5008 flags.go:64] FLAG: --cluster-domain=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229478 5008 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229488 5008 flags.go:64] FLAG: --config-dir=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229505 5008 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229515 5008 flags.go:64] FLAG: --container-log-max-files="5"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229526 5008 flags.go:64] FLAG: --container-log-max-size="10Mi"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229534 5008 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229543 5008 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229553 5008 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229562 5008 flags.go:64] FLAG: --contention-profiling="false"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229570 5008 flags.go:64] FLAG: --cpu-cfs-quota="true"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229579 5008 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229589 5008 flags.go:64] FLAG: --cpu-manager-policy="none"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229598 5008 flags.go:64] FLAG: --cpu-manager-policy-options=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229609 5008 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229618 5008 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229626 5008 flags.go:64] FLAG: --enable-debugging-handlers="true"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229635 5008 flags.go:64] FLAG: --enable-load-reader="false"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229643 5008 flags.go:64] FLAG: --enable-server="true"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229652 5008 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229663 5008 flags.go:64] FLAG: --event-burst="100"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229672 5008 flags.go:64] FLAG: --event-qps="50"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229681 5008 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229690 5008 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229699 5008 flags.go:64] FLAG: --eviction-hard=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229709 5008 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229718 5008 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229727 5008 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229736 5008 flags.go:64] FLAG: --eviction-soft=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229745 5008 flags.go:64] FLAG: --eviction-soft-grace-period=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229753 5008 flags.go:64] FLAG: --exit-on-lock-contention="false"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229762 5008 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229770 5008 flags.go:64] FLAG: --experimental-mounter-path=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229779 5008 flags.go:64] FLAG: --fail-cgroupv1="false"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229788 5008 flags.go:64] FLAG: --fail-swap-on="true"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229796 5008 flags.go:64] FLAG: --feature-gates=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229807 5008 flags.go:64] FLAG: --file-check-frequency="20s"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229816 5008 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229826 5008 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229836 5008 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229845 5008 flags.go:64] FLAG: --healthz-port="10248"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229854 5008 flags.go:64] FLAG: --help="false"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229864 5008 flags.go:64] FLAG: --hostname-override=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229872 5008 flags.go:64] FLAG: --housekeeping-interval="10s"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229881 5008 flags.go:64] FLAG: --http-check-frequency="20s"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229890 5008 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229898 5008 flags.go:64] FLAG: --image-credential-provider-config=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229907 5008 flags.go:64] FLAG: --image-gc-high-threshold="85"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229915 5008 flags.go:64] FLAG: --image-gc-low-threshold="80"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229924 5008 flags.go:64] FLAG: --image-service-endpoint=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229933 5008 flags.go:64] FLAG: --kernel-memcg-notification="false"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229942 5008 flags.go:64] FLAG: --kube-api-burst="100"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229951 5008 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229960 5008 flags.go:64] FLAG: --kube-api-qps="50"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.229993 5008 flags.go:64] FLAG: --kube-reserved=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230002 5008 flags.go:64] FLAG: --kube-reserved-cgroup=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230010 5008 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230019 5008 flags.go:64] FLAG: --kubelet-cgroups=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230028 5008 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230037 5008 flags.go:64] FLAG: --lock-file=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230046 5008 flags.go:64] FLAG: --log-cadvisor-usage="false"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230054 5008 flags.go:64] FLAG: --log-flush-frequency="5s"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230063 5008 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230076 5008 flags.go:64] FLAG: --log-json-split-stream="false"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230085 5008 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230093 5008 flags.go:64] FLAG: --log-text-split-stream="false"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230102 5008 flags.go:64] FLAG: --logging-format="text"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230111 5008 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230121 5008 flags.go:64] FLAG: --make-iptables-util-chains="true"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230129 5008 flags.go:64] FLAG: --manifest-url=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230138 5008 flags.go:64] FLAG: --manifest-url-header=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230149 5008 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230158 5008 flags.go:64] FLAG: --max-open-files="1000000"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230169 5008 flags.go:64] FLAG: --max-pods="110"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230178 5008 flags.go:64] FLAG: --maximum-dead-containers="-1"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230188 5008 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230197 5008 flags.go:64] FLAG: --memory-manager-policy="None"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230205 5008 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230214 5008 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230223 5008 flags.go:64] FLAG: --node-ip="192.168.126.11"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230232 5008 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230251 5008 flags.go:64] FLAG: --node-status-max-images="50"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230260 5008 flags.go:64] FLAG: --node-status-update-frequency="10s"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230269 5008 flags.go:64] FLAG: --oom-score-adj="-999"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230278 5008 flags.go:64] FLAG: --pod-cidr=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230286 5008 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230300 5008 flags.go:64] FLAG: --pod-manifest-path=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230308 5008 flags.go:64] FLAG: --pod-max-pids="-1"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230318 5008 flags.go:64] FLAG: --pods-per-core="0"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230326 5008 flags.go:64] FLAG: --port="10250"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230335 5008 flags.go:64] FLAG: --protect-kernel-defaults="false"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230344 5008 flags.go:64] FLAG: --provider-id=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230353 5008 flags.go:64] FLAG: --qos-reserved=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230361 5008 flags.go:64] FLAG: --read-only-port="10255"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230370 5008 flags.go:64] FLAG: --register-node="true"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230379 5008 flags.go:64] FLAG: --register-schedulable="true"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230388 5008 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230401 5008 flags.go:64] FLAG: --registry-burst="10"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230410 5008 flags.go:64] FLAG: --registry-qps="5"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230418 5008 flags.go:64] FLAG: --reserved-cpus=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230428 5008 flags.go:64] FLAG: --reserved-memory=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230441 5008 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230450 5008 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230459 5008 flags.go:64] FLAG: --rotate-certificates="false"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230468 5008 flags.go:64] FLAG: --rotate-server-certificates="false"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230476 5008 flags.go:64] FLAG: --runonce="false"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230485 5008 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230494 5008 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230503 5008 flags.go:64] FLAG: --seccomp-default="false"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230512 5008 flags.go:64] FLAG: --serialize-image-pulls="true"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230520 5008 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230530 5008 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230539 5008 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230548 5008 flags.go:64] FLAG: --storage-driver-password="root"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230556 5008 flags.go:64] FLAG: --storage-driver-secure="false"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230565 5008 flags.go:64] FLAG: --storage-driver-table="stats"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230574 5008 flags.go:64] FLAG: --storage-driver-user="root"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230583 5008 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230592 5008 flags.go:64] FLAG: --sync-frequency="1m0s"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230601 5008 flags.go:64] FLAG: --system-cgroups=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230609 5008 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230622 5008 flags.go:64] FLAG: --system-reserved-cgroup=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230631 5008 flags.go:64] FLAG: --tls-cert-file=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230640 5008 flags.go:64] FLAG: --tls-cipher-suites="[]"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230649 5008 flags.go:64] FLAG: --tls-min-version=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230658 5008 flags.go:64] FLAG: --tls-private-key-file=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230667 5008 flags.go:64] FLAG: --topology-manager-policy="none"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230675 5008 flags.go:64] FLAG: --topology-manager-policy-options=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230685 5008 flags.go:64] FLAG: --topology-manager-scope="container"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230694 5008 flags.go:64] FLAG: --v="2"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230706 5008 flags.go:64] FLAG: --version="false"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230717 5008 flags.go:64] FLAG: --vmodule=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230728 5008 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.230738 5008 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.230929 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.230938 5008 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.230946 5008 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.230954 5008 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.230986 5008 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.230995 5008 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231003 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231013 5008 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231023 5008 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231033 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231041 5008 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231050 5008 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231060 5008 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231073 5008 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231082 5008 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231090 5008 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231098 5008 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231106 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231116 5008 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231125 5008 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231133 5008 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231140 5008 feature_gate.go:330] unrecognized feature gate: Example
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231148 5008 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231155 5008 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231162 5008 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231170 5008 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231178 5008 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231185 5008 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231193 5008 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231200 5008 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231213 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231221 5008 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231228 5008 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231235 5008 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231243 5008 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231253 5008 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231263 5008 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231272 5008 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231280 5008 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231288 5008 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231296 5008 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231305 5008 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231313 5008 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231321 5008 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231329 5008 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231337 5008 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231345 5008 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231352 5008 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231360 5008 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231368 5008 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231376 5008 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231384 5008 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231391 5008 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231398 5008 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231406 5008 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231414 5008 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231422 5008 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231430 5008 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231438 5008 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231445 5008 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231452 5008 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231461 5008 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231471 5008 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231479 5008 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231486 5008 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231493 5008 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231501 5008 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231510 5008 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231517 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231525 5008 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.231532 5008 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.231556 5008 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.248599 5008 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.248665 5008 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248782 5008 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248794 5008 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248799 5008 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248805 5008 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248809 5008 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248813 5008 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248818 5008 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248823 5008 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248829 5008 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248835 5008 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248843 5008 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248848 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248852 5008 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248856 5008 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248860 5008 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248864 5008 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248868 5008 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248874 5008 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248878 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248884 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248895 5008 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248899 5008 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248903 5008 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248907 5008 feature_gate.go:330] unrecognized feature gate: Example Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248911 5008 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248915 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248919 5008 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248923 5008 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248927 5008 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248931 5008 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248935 5008 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248941 5008 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 
22:38:45.248946 5008 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248953 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248958 5008 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248977 5008 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248981 5008 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248985 5008 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248989 5008 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.248994 5008 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249000 5008 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249005 5008 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249011 5008 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249016 5008 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249021 5008 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249026 5008 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249031 5008 feature_gate.go:330] unrecognized feature gate: 
NodeDisruptionPolicy Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249035 5008 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249039 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249044 5008 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249048 5008 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249054 5008 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249062 5008 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249067 5008 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249072 5008 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249076 5008 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249081 5008 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249085 5008 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249090 5008 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249094 5008 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249098 5008 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 26 22:38:45 crc kubenswrapper[5008]: 
W1126 22:38:45.249102 5008 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249106 5008 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249110 5008 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249114 5008 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249119 5008 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249124 5008 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249128 5008 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249132 5008 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249136 5008 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249140 5008 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.249149 5008 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249306 5008 feature_gate.go:330] unrecognized feature gate: 
VSphereMultiNetworks Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249315 5008 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249319 5008 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249323 5008 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249328 5008 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249332 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249336 5008 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249340 5008 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249345 5008 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249349 5008 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249353 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249356 5008 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249361 5008 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249366 5008 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249370 5008 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 26 22:38:45 crc kubenswrapper[5008]: 
W1126 22:38:45.249374 5008 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249378 5008 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249382 5008 feature_gate.go:330] unrecognized feature gate: Example Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249386 5008 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249391 5008 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249398 5008 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249402 5008 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249407 5008 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249411 5008 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249415 5008 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249420 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249423 5008 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249427 5008 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249431 5008 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 
22:38:45.249435 5008 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249439 5008 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249443 5008 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249447 5008 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249451 5008 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249454 5008 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249458 5008 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249462 5008 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249466 5008 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249470 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249474 5008 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249477 5008 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249481 5008 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249485 5008 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 
22:38:45.249489 5008 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249496 5008 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249500 5008 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249505 5008 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249509 5008 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249512 5008 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249517 5008 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249520 5008 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249524 5008 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249528 5008 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249532 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249536 5008 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249541 5008 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249546 5008 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249551 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249555 5008 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249559 5008 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249563 5008 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249567 5008 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249570 5008 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249574 5008 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249578 5008 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249582 5008 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249587 5008 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249592 5008 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249596 5008 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249601 5008 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.249604 5008 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.249612 5008 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.251722 5008 server.go:940] "Client rotation is on, will bootstrap in background" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.258716 5008 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.258845 5008 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.261864 5008 server.go:997] "Starting client certificate rotation" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.261894 5008 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.263341 5008 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-16 22:49:02.361146196 +0000 UTC Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.263465 5008 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.294912 5008 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.298001 5008 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 26 22:38:45 crc kubenswrapper[5008]: E1126 22:38:45.301051 5008 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.329238 5008 log.go:25] "Validated CRI v1 runtime API" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.381043 5008 log.go:25] "Validated CRI v1 image API" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.383551 5008 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.388484 5008 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-26-22-34-14-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.388523 5008 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.414408 5008 manager.go:217] Machine: {Timestamp:2025-11-26 22:38:45.410481834 +0000 UTC m=+0.823175916 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:50254066-f1e7-4e4d-8ba4-3174542eac6b BootID:286285dc-f4e0-4a95-97ca-f92c5aacc002 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 
Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:93:b1:a3 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:93:b1:a3 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:a5:47:d9 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:6d:c3:df Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:f4:38:1d Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:53:e4:03 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:be:eb:4c:ee:8c:64 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:6a:75:b7:58:a6:e9 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 
Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.415255 5008 manager_no_libpfm.go:29] cAdvisor is build 
without cgo and/or libpfm support. Perf event counters are not available. Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.415530 5008 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.417241 5008 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.417590 5008 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.417648 5008 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Q
uantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.420510 5008 topology_manager.go:138] "Creating topology manager with none policy" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.420562 5008 container_manager_linux.go:303] "Creating device plugin manager" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.421309 5008 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.421362 5008 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.421625 5008 state_mem.go:36] "Initialized new in-memory state store" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.421767 5008 server.go:1245] "Using root directory" path="/var/lib/kubelet" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.427942 5008 kubelet.go:418] "Attempting to sync node with API server" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.428019 5008 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.428059 5008 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.428090 5008 kubelet.go:324] "Adding apiserver pod source" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.428114 5008 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.435011 5008 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.437079 5008 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.438326 5008 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.438806 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 26 22:38:45 crc kubenswrapper[5008]: E1126 22:38:45.438939 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.438899 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 26 22:38:45 crc kubenswrapper[5008]: E1126 22:38:45.439073 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: 
connect: connection refused" logger="UnhandledError" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.441612 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.441639 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.441647 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.441657 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.441670 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.441678 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.441687 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.441699 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.441708 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.441719 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.441732 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.441741 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.445741 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.446202 5008 server.go:1280] 
"Started kubelet" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.446372 5008 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.447139 5008 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.447792 5008 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Nov 26 22:38:45 crc systemd[1]: Started Kubernetes Kubelet. Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.448893 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.453058 5008 server.go:460] "Adding debug handlers to kubelet server" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.454620 5008 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.454667 5008 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.454951 5008 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 22:02:41.951907089 +0000 UTC Nov 26 22:38:45 crc kubenswrapper[5008]: E1126 22:38:45.455111 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.455156 5008 volume_manager.go:287] "The desired_state_of_world populator starts" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.455185 5008 volume_manager.go:289] "Starting Kubelet Volume Manager" Nov 26 22:38:45 crc kubenswrapper[5008]: 
I1126 22:38:45.455737 5008 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.455820 5008 factory.go:55] Registering systemd factory Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.455836 5008 factory.go:221] Registration of the systemd container factory successfully Nov 26 22:38:45 crc kubenswrapper[5008]: E1126 22:38:45.456221 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="200ms" Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.456224 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.456280 5008 factory.go:153] Registering CRI-O factory Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.456297 5008 factory.go:221] Registration of the crio container factory successfully Nov 26 22:38:45 crc kubenswrapper[5008]: E1126 22:38:45.456295 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.456374 5008 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Nov 26 22:38:45 crc 
kubenswrapper[5008]: I1126 22:38:45.456406 5008 factory.go:103] Registering Raw factory Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.456426 5008 manager.go:1196] Started watching for new ooms in manager Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.457267 5008 manager.go:319] Starting recovery of all containers Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.473660 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.473733 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.473750 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.473768 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.473785 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.473801 
5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.473815 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.473845 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.473865 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.473881 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.473900 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.473928 5008 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.473942 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.473976 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: E1126 22:38:45.472602 5008 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.166:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187baf931e206172 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 22:38:45.446173042 +0000 UTC m=+0.858867044,LastTimestamp:2025-11-26 22:38:45.446173042 +0000 UTC m=+0.858867044,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474037 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" 
seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474119 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474158 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474186 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474209 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474232 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474255 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 
22:38:45.474277 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474300 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474321 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474343 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474367 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474397 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474420 5008 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474442 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474466 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474489 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474512 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474534 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474555 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474576 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474602 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474668 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474691 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474714 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474735 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474761 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474784 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474847 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474900 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474920 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474941 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.474986 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475007 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475034 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475061 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475082 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475102 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" 
seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475129 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475152 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475174 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475196 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475218 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475240 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475264 5008 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475287 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475310 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475332 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475352 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475382 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475400 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475420 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475441 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475462 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475481 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475502 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475554 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475574 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475594 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475614 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475636 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475659 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475680 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475704 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475723 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475747 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475766 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475787 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475807 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475828 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475850 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475872 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475893 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475911 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475933 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475953 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.475998 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476019 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476039 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476059 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476080 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476101 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476120 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476217 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476249 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476276 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476298 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476318 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476339 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476360 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476394 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476420 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476444 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476467 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476490 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476513 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476537 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476560 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476581 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476605 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476627 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476648 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476668 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476690 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476711 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476731 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476751 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476771 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476794 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476818 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476839 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476862 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476882 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476903 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476923 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476945 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.476993 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477021 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477048 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477074 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477100 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477123 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477145 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477164 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477187 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477213 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477242 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477262 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477282 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477303 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477323 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477347 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477367 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477386 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477405 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477429 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477450 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477473 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477493 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477513 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477535 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477555 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477575 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477597 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477618 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.477640 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480107 5008 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480192 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480216 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480233 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480249 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480267 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480287 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480303 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480321 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480342 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480361 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480380 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480399 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480415 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480431 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480446 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480462 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480481 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480498 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480512 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480532 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480547 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480565 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480583 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480638 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480655 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480700 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480718 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480736 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480752 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480769 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480783 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480798 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480813 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480828 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480844 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480861 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480877 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480893 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480911 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480928 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480944 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.480994 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.481010 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.481029 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.481044 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" 
seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.481061 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.481077 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.481093 5008 reconstruct.go:97] "Volume reconstruction finished" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.481103 5008 reconciler.go:26] "Reconciler: start to sync state" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.484791 5008 manager.go:324] Recovery completed Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.493821 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.495798 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.495837 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.495848 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.496636 5008 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.496652 5008 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 
22:38:45.496677 5008 state_mem.go:36] "Initialized new in-memory state store" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.513499 5008 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.515850 5008 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.517097 5008 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.517151 5008 kubelet.go:2335] "Starting kubelet main sync loop" Nov 26 22:38:45 crc kubenswrapper[5008]: E1126 22:38:45.517425 5008 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 26 22:38:45 crc kubenswrapper[5008]: W1126 22:38:45.517761 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 26 22:38:45 crc kubenswrapper[5008]: E1126 22:38:45.517859 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.521258 5008 policy_none.go:49] "None policy: Start" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.521802 5008 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.521827 5008 state_mem.go:35] "Initializing new in-memory state store" Nov 26 22:38:45 crc kubenswrapper[5008]: E1126 
22:38:45.555716 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.582090 5008 manager.go:334] "Starting Device Plugin manager" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.582173 5008 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.582187 5008 server.go:79] "Starting device plugin registration server" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.582619 5008 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.582638 5008 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.582807 5008 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.583067 5008 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.583093 5008 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 26 22:38:45 crc kubenswrapper[5008]: E1126 22:38:45.591891 5008 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.618271 5008 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.618409 5008 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.621416 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.621467 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.621486 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.622529 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.622599 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.622604 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.624235 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.624279 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.624291 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.624298 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.624372 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 
22:38:45.624522 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.624562 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.625131 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.625197 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.625922 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.626028 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.626056 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.626345 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.626384 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.626408 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.626607 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.626693 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.626748 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.627894 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.627937 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.627954 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.628403 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.628501 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.628533 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.628783 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.629043 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.629109 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.630570 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.630619 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.630642 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.630925 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.631018 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.631159 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.631285 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.631399 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.634846 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.634885 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.634909 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:45 crc kubenswrapper[5008]: E1126 22:38:45.656871 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="400ms" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.682736 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.683068 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.683132 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.683164 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.683185 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.683227 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.683248 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.683289 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.683309 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.683331 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.683373 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.683396 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.683423 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.683448 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.683472 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.683498 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.683846 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.683889 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.683905 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.683942 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 22:38:45 crc kubenswrapper[5008]: E1126 22:38:45.684583 5008 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.166:6443: connect: connection refused" node="crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.784717 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.784786 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.784815 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.784836 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.784855 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.784877 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.784907 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: 
I1126 22:38:45.784932 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.784943 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.785033 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.784955 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.785084 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.785112 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.785120 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.785146 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.785147 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.785178 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.785216 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.785188 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.785228 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.785246 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.785189 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.785258 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.785278 5008 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.785311 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.785283 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.785378 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.785405 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.785452 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 
22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.785572 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.884944 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.886514 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.886553 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.886565 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.886592 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 22:38:45 crc kubenswrapper[5008]: E1126 22:38:45.887070 5008 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.166:6443: connect: connection refused" node="crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.952870 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.965432 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 26 22:38:45 crc kubenswrapper[5008]: I1126 22:38:45.995641 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 22:38:46 crc kubenswrapper[5008]: I1126 22:38:46.014104 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 22:38:46 crc kubenswrapper[5008]: I1126 22:38:46.036436 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 22:38:46 crc kubenswrapper[5008]: E1126 22:38:46.058149 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="800ms" Nov 26 22:38:46 crc kubenswrapper[5008]: I1126 22:38:46.288036 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:38:46 crc kubenswrapper[5008]: I1126 22:38:46.289750 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:46 crc kubenswrapper[5008]: I1126 22:38:46.289897 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:38:46 crc kubenswrapper[5008]: I1126 22:38:46.289949 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:46 crc kubenswrapper[5008]: I1126 22:38:46.290055 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 22:38:46 crc kubenswrapper[5008]: E1126 22:38:46.290987 5008 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.166:6443: connect: connection refused" node="crc" Nov 26 22:38:46 crc kubenswrapper[5008]: W1126 22:38:46.427541 5008 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-61721e320a74ed2215fe28a9f2cc3ed4971d2b50853004b0cd8c6698be2ac2cf WatchSource:0}: Error finding container 61721e320a74ed2215fe28a9f2cc3ed4971d2b50853004b0cd8c6698be2ac2cf: Status 404 returned error can't find the container with id 61721e320a74ed2215fe28a9f2cc3ed4971d2b50853004b0cd8c6698be2ac2cf Nov 26 22:38:46 crc kubenswrapper[5008]: W1126 22:38:46.433616 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-835b8d3f9621364a12496ddcf7933079ace2f67cf16387c8a29e060846653432 WatchSource:0}: Error finding container 835b8d3f9621364a12496ddcf7933079ace2f67cf16387c8a29e060846653432: Status 404 returned error can't find the container with id 835b8d3f9621364a12496ddcf7933079ace2f67cf16387c8a29e060846653432 Nov 26 22:38:46 crc kubenswrapper[5008]: W1126 22:38:46.435554 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-c851a6bb60451bc25b0714a94c700cd683e97e8e84c60926a5a770348ac384a0 WatchSource:0}: Error finding container c851a6bb60451bc25b0714a94c700cd683e97e8e84c60926a5a770348ac384a0: Status 404 returned error can't find the container with id c851a6bb60451bc25b0714a94c700cd683e97e8e84c60926a5a770348ac384a0 Nov 26 22:38:46 crc kubenswrapper[5008]: W1126 22:38:46.436581 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-509ddad49568b95b7eed9755c52a6594e28985755cc87613b37577f51669b426 WatchSource:0}: Error finding container 509ddad49568b95b7eed9755c52a6594e28985755cc87613b37577f51669b426: Status 404 returned error can't find the container with id 
509ddad49568b95b7eed9755c52a6594e28985755cc87613b37577f51669b426 Nov 26 22:38:46 crc kubenswrapper[5008]: W1126 22:38:46.439658 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 26 22:38:46 crc kubenswrapper[5008]: E1126 22:38:46.439826 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Nov 26 22:38:46 crc kubenswrapper[5008]: I1126 22:38:46.449685 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 26 22:38:46 crc kubenswrapper[5008]: I1126 22:38:46.455828 5008 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 21:20:30.305722238 +0000 UTC Nov 26 22:38:46 crc kubenswrapper[5008]: I1126 22:38:46.455911 5008 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 430h41m43.849813681s for next certificate rotation Nov 26 22:38:46 crc kubenswrapper[5008]: W1126 22:38:46.475729 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 26 22:38:46 crc kubenswrapper[5008]: E1126 22:38:46.475851 5008 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Nov 26 22:38:46 crc kubenswrapper[5008]: I1126 22:38:46.522128 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"835b8d3f9621364a12496ddcf7933079ace2f67cf16387c8a29e060846653432"} Nov 26 22:38:46 crc kubenswrapper[5008]: W1126 22:38:46.522618 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-fc38c0f80d2dbebeb0a0c1170322cc5944ab94bde1e744c9df8e39e0c75dfe66 WatchSource:0}: Error finding container fc38c0f80d2dbebeb0a0c1170322cc5944ab94bde1e744c9df8e39e0c75dfe66: Status 404 returned error can't find the container with id fc38c0f80d2dbebeb0a0c1170322cc5944ab94bde1e744c9df8e39e0c75dfe66 Nov 26 22:38:46 crc kubenswrapper[5008]: I1126 22:38:46.524694 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"61721e320a74ed2215fe28a9f2cc3ed4971d2b50853004b0cd8c6698be2ac2cf"} Nov 26 22:38:46 crc kubenswrapper[5008]: I1126 22:38:46.526750 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"509ddad49568b95b7eed9755c52a6594e28985755cc87613b37577f51669b426"} Nov 26 22:38:46 crc kubenswrapper[5008]: I1126 22:38:46.528337 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c851a6bb60451bc25b0714a94c700cd683e97e8e84c60926a5a770348ac384a0"} Nov 26 22:38:46 crc kubenswrapper[5008]: W1126 22:38:46.536457 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 26 22:38:46 crc kubenswrapper[5008]: E1126 22:38:46.536608 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Nov 26 22:38:46 crc kubenswrapper[5008]: E1126 22:38:46.859754 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="1.6s" Nov 26 22:38:47 crc kubenswrapper[5008]: W1126 22:38:47.076021 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 26 22:38:47 crc kubenswrapper[5008]: E1126 22:38:47.076150 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Nov 26 22:38:47 crc kubenswrapper[5008]: I1126 22:38:47.091575 
5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:38:47 crc kubenswrapper[5008]: I1126 22:38:47.093424 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:47 crc kubenswrapper[5008]: I1126 22:38:47.093473 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:38:47 crc kubenswrapper[5008]: I1126 22:38:47.093490 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:47 crc kubenswrapper[5008]: I1126 22:38:47.093521 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 22:38:47 crc kubenswrapper[5008]: E1126 22:38:47.093885 5008 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.166:6443: connect: connection refused" node="crc" Nov 26 22:38:47 crc kubenswrapper[5008]: I1126 22:38:47.333561 5008 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Nov 26 22:38:47 crc kubenswrapper[5008]: E1126 22:38:47.334795 5008 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Nov 26 22:38:47 crc kubenswrapper[5008]: I1126 22:38:47.449998 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 26 22:38:47 crc kubenswrapper[5008]: I1126 
22:38:47.532681 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fc38c0f80d2dbebeb0a0c1170322cc5944ab94bde1e744c9df8e39e0c75dfe66"} Nov 26 22:38:48 crc kubenswrapper[5008]: W1126 22:38:48.307742 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 26 22:38:48 crc kubenswrapper[5008]: E1126 22:38:48.307877 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.450292 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 26 22:38:48 crc kubenswrapper[5008]: E1126 22:38:48.461040 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="3.2s" Nov 26 22:38:48 crc kubenswrapper[5008]: W1126 22:38:48.537867 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 26 22:38:48 crc kubenswrapper[5008]: 
E1126 22:38:48.538041 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.539185 5008 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b" exitCode=0 Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.539291 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.539292 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b"} Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.541106 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.541166 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.541183 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.542543 5008 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="d019140fe5457229b7831f556c5de41b4c2dd9f1e070670dcd29469f573fab7f" exitCode=0 Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.542701 5008 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"d019140fe5457229b7831f556c5de41b4c2dd9f1e070670dcd29469f573fab7f"} Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.542902 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.544670 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.544732 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.544760 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.545149 5008 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d" exitCode=0 Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.545198 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d"} Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.545256 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.547189 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.547231 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 
22:38:48.547248 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.553016 5008 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6" exitCode=0 Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.553069 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6"} Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.553180 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.554485 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.554533 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.554551 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.556335 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb"} Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.556380 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222"} Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.556400 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61"} Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.563909 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.565396 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.565448 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.565473 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.694152 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.695593 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.695643 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.695664 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:38:48 crc kubenswrapper[5008]: I1126 22:38:48.695702 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 22:38:48 crc 
kubenswrapper[5008]: E1126 22:38:48.696462 5008 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.166:6443: connect: connection refused" node="crc"
Nov 26 22:38:49 crc kubenswrapper[5008]: W1126 22:38:49.248225 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused
Nov 26 22:38:49 crc kubenswrapper[5008]: E1126 22:38:49.248333 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError"
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.450548 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.561505 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"071f58b25e93d75bba1ef6b9123717c1e8887e869077efebf70fc55f91d8fe76"}
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.561555 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c94ef8187fdbc6022fe2f451e4d267e0311448a6820236175adce557f66f2bd9"}
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.561570 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"709d4c7c869a6123fb3da1ed7bdc57c002e11b9eabcd3a44112be80f50f101a4"}
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.561679 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.563005 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.563058 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.563073 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.566029 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2d8a32772c75918a8f47660b624d60692798964409588b64b0ba25f26d3b061a"}
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.566071 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.567352 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.567386 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.567396 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.575746 5008 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8" exitCode=0
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.575838 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8"}
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.575879 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.576937 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.576991 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.577005 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.581802 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f"}
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.581847 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e"}
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.581861 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801"}
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.581873 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a"}
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.594727 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531"}
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.594817 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.597519 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.597549 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:49 crc kubenswrapper[5008]: I1126 22:38:49.597560 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:49 crc kubenswrapper[5008]: W1126 22:38:49.816266 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused
Nov 26 22:38:49 crc kubenswrapper[5008]: E1126 22:38:49.816723 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.602299 5008 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0" exitCode=0
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.602473 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0"}
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.602483 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.603878 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.603932 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.603950 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.608122 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a"}
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.608218 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.608292 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.608321 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.608382 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.609060 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.609542 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.609593 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.609613 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.609839 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.609882 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.609904 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.610050 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.610263 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.610280 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.611193 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.611238 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:50 crc kubenswrapper[5008]: I1126 22:38:50.611256 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:51 crc kubenswrapper[5008]: I1126 22:38:51.593330 5008 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Nov 26 22:38:51 crc kubenswrapper[5008]: I1126 22:38:51.618404 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66"}
Nov 26 22:38:51 crc kubenswrapper[5008]: I1126 22:38:51.618465 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff"}
Nov 26 22:38:51 crc kubenswrapper[5008]: I1126 22:38:51.618487 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502"}
Nov 26 22:38:51 crc kubenswrapper[5008]: I1126 22:38:51.618494 5008 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 26 22:38:51 crc kubenswrapper[5008]: I1126 22:38:51.618556 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:51 crc kubenswrapper[5008]: I1126 22:38:51.618606 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:51 crc kubenswrapper[5008]: I1126 22:38:51.619892 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:51 crc kubenswrapper[5008]: I1126 22:38:51.619931 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:51 crc kubenswrapper[5008]: I1126 22:38:51.619942 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:51 crc kubenswrapper[5008]: I1126 22:38:51.620026 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:51 crc kubenswrapper[5008]: I1126 22:38:51.620047 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:51 crc kubenswrapper[5008]: I1126 22:38:51.619948 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:51 crc kubenswrapper[5008]: I1126 22:38:51.898129 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:51 crc kubenswrapper[5008]: I1126 22:38:51.899762 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:51 crc kubenswrapper[5008]: I1126 22:38:51.899827 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:51 crc kubenswrapper[5008]: I1126 22:38:51.899846 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:51 crc kubenswrapper[5008]: I1126 22:38:51.899882 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Nov 26 22:38:52 crc kubenswrapper[5008]: I1126 22:38:52.259704 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 26 22:38:52 crc kubenswrapper[5008]: I1126 22:38:52.631282 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0"}
Nov 26 22:38:52 crc kubenswrapper[5008]: I1126 22:38:52.631347 5008 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 26 22:38:52 crc kubenswrapper[5008]: I1126 22:38:52.631366 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b"}
Nov 26 22:38:52 crc kubenswrapper[5008]: I1126 22:38:52.631399 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:52 crc kubenswrapper[5008]: I1126 22:38:52.631456 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:52 crc kubenswrapper[5008]: I1126 22:38:52.632857 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:52 crc kubenswrapper[5008]: I1126 22:38:52.632859 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:52 crc kubenswrapper[5008]: I1126 22:38:52.632891 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:52 crc kubenswrapper[5008]: I1126 22:38:52.632934 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:52 crc kubenswrapper[5008]: I1126 22:38:52.633006 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:52 crc kubenswrapper[5008]: I1126 22:38:52.633008 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:53 crc kubenswrapper[5008]: I1126 22:38:53.112258 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 26 22:38:53 crc kubenswrapper[5008]: I1126 22:38:53.190299 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 22:38:53 crc kubenswrapper[5008]: I1126 22:38:53.190583 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:53 crc kubenswrapper[5008]: I1126 22:38:53.191885 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:53 crc kubenswrapper[5008]: I1126 22:38:53.191927 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:53 crc kubenswrapper[5008]: I1126 22:38:53.191944 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:53 crc kubenswrapper[5008]: I1126 22:38:53.196594 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 22:38:53 crc kubenswrapper[5008]: I1126 22:38:53.409285 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 26 22:38:53 crc kubenswrapper[5008]: I1126 22:38:53.634901 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:53 crc kubenswrapper[5008]: I1126 22:38:53.635072 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:53 crc kubenswrapper[5008]: I1126 22:38:53.635091 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:53 crc kubenswrapper[5008]: I1126 22:38:53.637570 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:53 crc kubenswrapper[5008]: I1126 22:38:53.637613 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:53 crc kubenswrapper[5008]: I1126 22:38:53.637630 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:53 crc kubenswrapper[5008]: I1126 22:38:53.637739 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:53 crc kubenswrapper[5008]: I1126 22:38:53.637777 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:53 crc kubenswrapper[5008]: I1126 22:38:53.637793 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:53 crc kubenswrapper[5008]: I1126 22:38:53.637741 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:53 crc kubenswrapper[5008]: I1126 22:38:53.637848 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:53 crc kubenswrapper[5008]: I1126 22:38:53.637864 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:54 crc kubenswrapper[5008]: I1126 22:38:54.637812 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:54 crc kubenswrapper[5008]: I1126 22:38:54.638847 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:54 crc kubenswrapper[5008]: I1126 22:38:54.638894 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:54 crc kubenswrapper[5008]: I1126 22:38:54.638905 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:54 crc kubenswrapper[5008]: I1126 22:38:54.923541 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Nov 26 22:38:54 crc kubenswrapper[5008]: I1126 22:38:54.924030 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:54 crc kubenswrapper[5008]: I1126 22:38:54.925664 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:54 crc kubenswrapper[5008]: I1126 22:38:54.925740 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:54 crc kubenswrapper[5008]: I1126 22:38:54.925768 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:55 crc kubenswrapper[5008]: E1126 22:38:55.592012 5008 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Nov 26 22:38:55 crc kubenswrapper[5008]: I1126 22:38:55.690199 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 22:38:55 crc kubenswrapper[5008]: I1126 22:38:55.690575 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:55 crc kubenswrapper[5008]: I1126 22:38:55.691889 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:55 crc kubenswrapper[5008]: I1126 22:38:55.691959 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:55 crc kubenswrapper[5008]: I1126 22:38:55.692015 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:55 crc kubenswrapper[5008]: I1126 22:38:55.716915 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 22:38:56 crc kubenswrapper[5008]: I1126 22:38:56.624843 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 22:38:56 crc kubenswrapper[5008]: I1126 22:38:56.643539 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:56 crc kubenswrapper[5008]: I1126 22:38:56.644663 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:56 crc kubenswrapper[5008]: I1126 22:38:56.644723 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:56 crc kubenswrapper[5008]: I1126 22:38:56.644764 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:56 crc kubenswrapper[5008]: I1126 22:38:56.694325 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Nov 26 22:38:56 crc kubenswrapper[5008]: I1126 22:38:56.694592 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:56 crc kubenswrapper[5008]: I1126 22:38:56.696586 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:56 crc kubenswrapper[5008]: I1126 22:38:56.696651 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:56 crc kubenswrapper[5008]: I1126 22:38:56.696674 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:57 crc kubenswrapper[5008]: I1126 22:38:57.647233 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:57 crc kubenswrapper[5008]: I1126 22:38:57.649117 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:57 crc kubenswrapper[5008]: I1126 22:38:57.649196 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:57 crc kubenswrapper[5008]: I1126 22:38:57.649222 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:57 crc kubenswrapper[5008]: I1126 22:38:57.654430 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 22:38:58 crc kubenswrapper[5008]: I1126 22:38:58.650506 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 22:38:58 crc kubenswrapper[5008]: I1126 22:38:58.651656 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 22:38:58 crc kubenswrapper[5008]: I1126 22:38:58.651719 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 22:38:58 crc kubenswrapper[5008]: I1126 22:38:58.651737 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 22:38:58 crc kubenswrapper[5008]: I1126 22:38:58.717282 5008 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Nov 26 22:38:58 crc kubenswrapper[5008]: I1126 22:38:58.717408 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Nov 26 22:39:00 crc kubenswrapper[5008]: I1126 22:39:00.451384 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Nov 26 22:39:01 crc kubenswrapper[5008]: I1126 22:39:01.022715 5008 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Nov 26 22:39:01 crc kubenswrapper[5008]: I1126 22:39:01.022805 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Nov 26 22:39:01 crc kubenswrapper[5008]: I1126 22:39:01.031334 5008 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Nov 26 22:39:01 crc kubenswrapper[5008]: I1126 22:39:01.031405 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Nov 26 22:39:02 crc kubenswrapper[5008]: I1126 22:39:02.265819 5008 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]log ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]etcd ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/start-apiserver-admission-initializer ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/openshift.io-api-request-count-filter ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/openshift.io-startkubeinformers ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/generic-apiserver-start-informers ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/priority-and-fairness-config-consumer ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/priority-and-fairness-filter ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/storage-object-count-tracker-hook ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/start-apiextensions-informers ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/start-apiextensions-controllers ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/crd-informer-synced ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/start-system-namespaces-controller ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/start-cluster-authentication-info-controller ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/start-legacy-token-tracking-controller ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/start-service-ip-repair-controllers ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/priority-and-fairness-config-producer ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/bootstrap-controller ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/start-kube-aggregator-informers ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/apiservice-status-local-available-controller ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/apiservice-status-remote-available-controller ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/apiservice-registration-controller ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/apiservice-wait-for-first-sync ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/apiservice-discovery-controller ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/kube-apiserver-autoregistration ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]autoregister-completion ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/apiservice-openapi-controller ok
Nov 26 22:39:02 crc kubenswrapper[5008]: [+]poststarthook/apiservice-openapiv3-controller ok
Nov 26 22:39:02 crc kubenswrapper[5008]: livez check failed
Nov 26 22:39:02 crc kubenswrapper[5008]: I1126 22:39:02.265888 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 26 22:39:05 crc kubenswrapper[5008]: E1126 22:39:05.592180 5008 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.026602 5008 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Nov 26 22:39:06 crc kubenswrapper[5008]: E1126 22:39:06.034336 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.037218 5008 trace.go:236] Trace[1070984673]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Nov-2025 22:38:52.160) (total time: 13876ms):
Nov 26 22:39:06 crc kubenswrapper[5008]: Trace[1070984673]: ---"Objects listed" error: 13876ms (22:39:06.037)
Nov 26 22:39:06 crc kubenswrapper[5008]: Trace[1070984673]: [13.876247904s] [13.876247904s] END
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.037252 5008 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.037623 5008 trace.go:236] Trace[1080234229]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Nov-2025 22:38:55.732) (total time: 10305ms):
Nov 26 22:39:06 crc kubenswrapper[5008]: Trace[1080234229]: ---"Objects listed" error: 10305ms (22:39:06.037)
Nov 26 22:39:06 crc kubenswrapper[5008]: Trace[1080234229]: [10.30504555s] [10.30504555s] END
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.037644 5008 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.041509 5008 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.041536 5008 trace.go:236] Trace[991690113]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Nov-2025 22:38:52.999) (total time: 13042ms):
Nov 26 22:39:06 crc kubenswrapper[5008]: Trace[991690113]: ---"Objects listed" error: 13042ms (22:39:06.041)
Nov 26 22:39:06 crc kubenswrapper[5008]: Trace[991690113]: [13.042308333s] [13.042308333s] END
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.041624 5008 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.042690 5008 trace.go:236] Trace[658682316]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Nov-2025 22:38:54.709) (total time: 11333ms):
Nov 26 22:39:06 crc kubenswrapper[5008]: Trace[658682316]: ---"Objects listed" error: 11332ms (22:39:06.042)
Nov 26 22:39:06 crc kubenswrapper[5008]: Trace[658682316]: [11.333068056s] [11.333068056s] END
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.042741 5008 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Nov 26 22:39:06 crc kubenswrapper[5008]: E1126 22:39:06.046797 5008 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.055017 5008 csr.go:261] certificate signing request csr-6ndnx is approved, waiting to be issued
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.065699 5008 csr.go:257] certificate signing request csr-6ndnx is issued
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.070081 5008 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37984->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.070199 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37984->192.168.126.11:17697: read: connection reset by peer"
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.349522 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.354014 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.441729 5008 apiserver.go:52] "Watching apiserver"
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.443830 5008 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.444134 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"]
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.444471 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.444480 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 26 22:39:06 crc kubenswrapper[5008]: E1126 22:39:06.444529 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.444556 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.444656 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.444745 5008 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 22:39:06 crc kubenswrapper[5008]: E1126 22:39:06.444818 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.444844 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:06 crc kubenswrapper[5008]: E1126 22:39:06.444898 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.446249 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.447222 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.447237 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.449434 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.449463 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.449535 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.449661 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.449709 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.449782 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.456293 5008 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 26 22:39:06 crc kubenswrapper[5008]: 
I1126 22:39:06.473943 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.485438 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.500555 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.511450 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.520407 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.527937 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.536810 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.545581 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.545630 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.545653 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.545679 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.545701 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.545722 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.545741 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.545767 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 
22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.545789 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.545807 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.545825 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.545846 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.545866 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.545886 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.545905 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.545924 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.545947 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.545986 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.545985 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546006 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546029 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546051 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546074 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546096 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546121 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" 
(UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546146 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546166 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546187 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546210 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546231 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546252 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546273 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546296 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546341 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546362 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546384 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546405 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546426 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546450 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546469 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546492 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 
22:39:06.546512 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546531 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546554 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546582 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546639 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546658 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546737 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546758 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546778 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546799 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546824 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 26 22:39:06 crc 
kubenswrapper[5008]: I1126 22:39:06.546847 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546871 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546895 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546919 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546947 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546985 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547009 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547034 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547060 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547085 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547109 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 26 22:39:06 crc kubenswrapper[5008]: 
I1126 22:39:06.547139 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547165 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547186 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547234 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547345 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547374 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547396 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547424 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547446 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547469 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547488 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 26 22:39:06 crc 
kubenswrapper[5008]: I1126 22:39:06.547537 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547559 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547578 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547597 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547618 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547639 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547666 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547688 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547717 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547741 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547762 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 26 22:39:06 crc 
kubenswrapper[5008]: I1126 22:39:06.545991 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547786 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546088 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546193 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546360 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546640 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546689 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546738 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546739 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546760 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546758 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546876 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546911 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.546946 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547068 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547913 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547916 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547079 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547177 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547259 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548022 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547247 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547311 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548051 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547348 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547389 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547404 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547487 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547534 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547549 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547577 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547690 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547701 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547731 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547757 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547867 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547285 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548168 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548118 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548174 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548279 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548338 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548372 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.547813 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548428 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548463 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548482 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548500 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548522 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548545 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548566 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548477 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548588 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548564 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548607 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548632 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548652 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 22:39:06 crc 
kubenswrapper[5008]: I1126 22:39:06.548676 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548696 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548717 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548738 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548762 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548782 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548802 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548822 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548844 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548864 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548884 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 26 
22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548908 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548929 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548949 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548986 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549007 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549096 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549115 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549141 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549158 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549176 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549203 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") 
" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549219 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549236 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549253 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549276 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549297 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549317 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549333 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549349 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549365 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549393 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549409 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549431 5008 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549447 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549466 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549484 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549500 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549517 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod 
\"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549534 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549549 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549564 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549589 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549605 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549594 5008 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549622 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549699 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549724 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549751 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549773 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549795 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549816 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549832 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549850 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549867 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549926 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549947 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549985 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550002 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550019 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550036 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550055 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550071 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550088 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550105 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550122 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550139 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550157 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550173 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550190 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550206 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550224 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550241 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550257 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550275 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550291 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 
22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550341 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550360 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550378 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550401 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550437 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550459 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550476 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550491 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550512 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550527 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550542 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 
22:39:06.550558 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550573 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550589 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550605 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550623 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550639 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550655 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550671 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550687 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550705 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550722 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 
22:39:06.550739 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550776 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550793 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550818 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550839 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 
22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550858 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550876 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550893 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550913 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550930 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550949 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550989 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551009 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551029 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551051 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551109 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551121 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551132 5008 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551142 5008 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551151 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551162 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551172 5008 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551183 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551193 5008 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551202 5008 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551223 5008 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551232 5008 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551248 5008 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551258 5008 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" 
(UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551267 5008 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551277 5008 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551287 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551296 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551305 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551316 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551325 5008 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 26 
22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551335 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551347 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551357 5008 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551367 5008 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551376 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551390 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551404 5008 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551418 5008 reconciler_common.go:293] "Volume 
detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551428 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551437 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551448 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551459 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551468 5008 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551478 5008 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551487 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551523 5008 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551535 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551547 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551558 5008 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551568 5008 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551577 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551586 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551596 5008 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.552288 5008 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548646 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.548673 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549002 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549215 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549239 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549439 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549614 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.555235 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.555251 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.555290 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.555327 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.555325 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.555491 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549631 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549870 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549892 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549909 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.549946 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550250 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550272 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550382 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.550928 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551103 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551296 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551352 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551398 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551435 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551497 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551734 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551739 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551782 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551796 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551836 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.551892 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.552132 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.552208 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.552214 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.552567 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.552768 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.552794 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.553186 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.553265 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: E1126 22:39:06.553381 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:39:07.053334455 +0000 UTC m=+22.466028457 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.555887 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.556020 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.556099 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.556169 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.556176 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.556189 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: E1126 22:39:06.556209 5008 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Nov 26 22:39:06 crc kubenswrapper[5008]: E1126 22:39:06.556267 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:07.0562475 +0000 UTC m=+22.468941582 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.556318 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.556314 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.556322 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.553548 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.553639 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.553640 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.553660 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.553806 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.553844 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.553946 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.554151 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.554218 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.554249 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.554262 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.554326 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.554337 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.554573 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.554838 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.554923 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.554937 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.555126 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.556410 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.556447 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.556625 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.556802 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.556843 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.556873 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.556933 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.557113 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.553480 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.557167 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.557178 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.557184 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.557235 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.557291 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.557450 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.557794 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.557807 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.557845 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.557902 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.557905 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.558056 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.558086 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.558102 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.558211 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.558542 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.558620 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.558685 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.558793 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.559252 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.559283 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.561049 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.561081 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.561278 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.561578 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.561925 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: E1126 22:39:06.562047 5008 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 26 22:39:06 crc kubenswrapper[5008]: E1126 22:39:06.562096 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:07.06208252 +0000 UTC m=+22.474776522 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.562130 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.562163 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.562365 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.562548 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.562579 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.562591 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.562651 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.562756 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.562771 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.562794 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.562823 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.562833 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.562865 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.563403 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.563857 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.564160 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.564143 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.564415 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.564686 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.565115 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.565521 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.565904 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.566031 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.566256 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.566231 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.566447 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.566715 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.566907 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.567518 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.567726 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.567757 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.568008 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.568018 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.568276 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.568596 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.568791 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.568950 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.569013 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.570475 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.571547 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.572258 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.572439 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.572690 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.573052 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.573137 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.573155 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.573350 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: E1126 22:39:06.574787 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 22:39:06 crc kubenswrapper[5008]: E1126 22:39:06.575031 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 22:39:06 crc kubenswrapper[5008]: E1126 22:39:06.575044 5008 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:06 crc kubenswrapper[5008]: E1126 22:39:06.575104 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:07.075087264 +0000 UTC m=+22.487781266 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.581602 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.584101 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 22:39:06 crc kubenswrapper[5008]: E1126 22:39:06.587709 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 22:39:06 crc kubenswrapper[5008]: E1126 22:39:06.587766 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 22:39:06 crc kubenswrapper[5008]: E1126 22:39:06.587784 5008 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:06 crc kubenswrapper[5008]: E1126 22:39:06.587889 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:07.087854449 +0000 UTC m=+22.500548541 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.601708 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.602728 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.611372 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.612034 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652527 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652558 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652618 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652628 5008 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652637 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652645 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652654 5008 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652662 5008 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652672 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652680 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node 
\"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652688 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652696 5008 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652706 5008 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652716 5008 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652729 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652742 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652752 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652763 5008 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652774 5008 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652785 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652796 5008 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652804 5008 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652812 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652820 5008 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652829 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" 
(UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652836 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652844 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652853 5008 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652862 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652869 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652877 5008 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652885 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652894 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652903 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652911 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652920 5008 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652929 5008 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652937 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652945 5008 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath 
\"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652954 5008 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652977 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.652988 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653006 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653018 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653030 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653040 5008 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653051 5008 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653062 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653072 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653082 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653092 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653102 5008 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653112 5008 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653124 5008 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653135 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653147 5008 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653147 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653158 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653196 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653201 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" 
DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653234 5008 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653244 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653253 5008 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653262 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653270 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653279 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653287 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 
22:39:06.653295 5008 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653304 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653312 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653321 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653329 5008 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653338 5008 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653345 5008 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653354 5008 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653362 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653370 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653378 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653386 5008 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653394 5008 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653403 5008 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653411 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653419 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653427 5008 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653435 5008 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653442 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653450 5008 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653458 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653466 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: 
I1126 22:39:06.653474 5008 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653482 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653491 5008 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653500 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653508 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653516 5008 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653524 5008 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653531 5008 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653539 5008 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653546 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653553 5008 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653561 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653569 5008 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653577 5008 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653585 5008 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653592 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653600 5008 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653608 5008 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653615 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653623 5008 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653631 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653639 5008 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath 
\"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653648 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653657 5008 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653666 5008 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653675 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653684 5008 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653693 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653700 5008 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653708 5008 reconciler_common.go:293] 
"Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653716 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653723 5008 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653730 5008 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653799 5008 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653847 5008 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653866 5008 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653887 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653904 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653922 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653941 5008 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.653982 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654000 5008 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654018 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654034 5008 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654051 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654067 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654085 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654102 5008 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654133 5008 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654153 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654169 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" 
Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654188 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654205 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654220 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654237 5008 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654286 5008 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654303 5008 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654319 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654335 5008 reconciler_common.go:293] 
"Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654351 5008 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654366 5008 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654382 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654398 5008 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654413 5008 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.654431 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.670099 5008 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.673420 5008 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a" exitCode=255 Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.673484 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a"} Nov 26 22:39:06 crc kubenswrapper[5008]: E1126 22:39:06.682676 5008 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.685284 5008 scope.go:117] "RemoveContainer" containerID="b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.685447 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.686397 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.702476 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4f
f51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.719979 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.726459 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.738172 5008 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.741283 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.750394 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.757496 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.760594 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.772638 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.772721 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 22:39:06 crc kubenswrapper[5008]: W1126 22:39:06.783400 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-c11abde4433d93189198996318fe3a337c14819608bb7f7fa97a8d6f92c73673 WatchSource:0}: Error finding container c11abde4433d93189198996318fe3a337c14819608bb7f7fa97a8d6f92c73673: Status 404 returned error can't find the container with id c11abde4433d93189198996318fe3a337c14819608bb7f7fa97a8d6f92c73673 Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.803911 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.812622 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 26 22:39:06 crc kubenswrapper[5008]: W1126 22:39:06.816899 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-d4a4f0b934bfb152c03f8d5abd0461bab32007af9bb2ed98ecdd56ee12d9e1e6 WatchSource:0}: Error 
finding container d4a4f0b934bfb152c03f8d5abd0461bab32007af9bb2ed98ecdd56ee12d9e1e6: Status 404 returned error can't find the container with id d4a4f0b934bfb152c03f8d5abd0461bab32007af9bb2ed98ecdd56ee12d9e1e6 Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.826471 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.850870 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.881653 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.893082 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.911757 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.928548 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.941336 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:06 crc kubenswrapper[5008]: I1126 22:39:06.952677 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.058639 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.058771 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:07 crc kubenswrapper[5008]: E1126 22:39:07.058836 5008 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:39:08.058817171 +0000 UTC m=+23.471511173 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:39:07 crc kubenswrapper[5008]: E1126 22:39:07.058917 5008 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 22:39:07 crc kubenswrapper[5008]: E1126 22:39:07.059015 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:08.058998247 +0000 UTC m=+23.471692249 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.068874 5008 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-11-26 22:34:06 +0000 UTC, rotation deadline is 2026-10-09 05:58:03.799437935 +0000 UTC Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.068939 5008 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7591h18m56.730513755s for next certificate rotation Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.159331 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.159377 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.159438 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:07 crc kubenswrapper[5008]: E1126 22:39:07.159554 5008 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 22:39:07 crc kubenswrapper[5008]: E1126 22:39:07.159614 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:08.15959801 +0000 UTC m=+23.572292012 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 22:39:07 crc kubenswrapper[5008]: E1126 22:39:07.159646 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 22:39:07 crc kubenswrapper[5008]: E1126 22:39:07.159668 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 22:39:07 crc kubenswrapper[5008]: E1126 22:39:07.159672 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 22:39:07 crc kubenswrapper[5008]: E1126 22:39:07.159710 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 22:39:07 crc kubenswrapper[5008]: E1126 22:39:07.159684 5008 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:07 crc kubenswrapper[5008]: E1126 22:39:07.159723 5008 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:07 crc kubenswrapper[5008]: E1126 22:39:07.159781 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:08.159769116 +0000 UTC m=+23.572463118 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:07 crc kubenswrapper[5008]: E1126 22:39:07.159804 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-11-26 22:39:08.159793957 +0000 UTC m=+23.572487959 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.266122 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.275927 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.287435 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.296458 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.303833 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.312611 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.320586 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.330730 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.356050 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-2
6T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.368018 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.439867 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-4qkmj"] Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.440462 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-8x546"] Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.440603 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.440880 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-8x546" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.442387 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.442490 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.442570 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.444120 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.450282 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.450703 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.450708 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.452297 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.476488 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.489288 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.502223 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.514614 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.520803 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.521419 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.522611 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.523309 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 26 22:39:07 crc 
kubenswrapper[5008]: I1126 22:39:07.524325 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.524871 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.525468 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.526465 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.527197 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.527193 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.528284 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.528833 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.529933 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.530548 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.531134 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.532133 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.532674 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.533780 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.534231 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.534832 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.535856 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.536388 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.537356 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.537895 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.539901 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.540388 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.541028 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.542311 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.543110 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.544211 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" 
path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.544873 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.545486 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.546780 5008 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.547830 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.549739 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.550685 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.551141 5008 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.552699 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.553402 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.556405 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.556609 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.557310 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.558575 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.559153 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.560219 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.560856 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.562000 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.562312 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s6mv\" (UniqueName: \"kubernetes.io/projected/57d86641-6721-4f54-bc04-f188d8d13079-kube-api-access-6s6mv\") pod \"node-resolver-8x546\" (UID: \"57d86641-6721-4f54-bc04-f188d8d13079\") " pod="openshift-dns/node-resolver-8x546" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.562365 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8e558d58-c5ad-41f5-930f-36ac26b1a1ea-rootfs\") pod \"machine-config-daemon-4qkmj\" (UID: \"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\") " pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.562408 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8e558d58-c5ad-41f5-930f-36ac26b1a1ea-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-4qkmj\" (UID: \"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\") " pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.562430 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkrlb\" (UniqueName: \"kubernetes.io/projected/8e558d58-c5ad-41f5-930f-36ac26b1a1ea-kube-api-access-mkrlb\") pod \"machine-config-daemon-4qkmj\" (UID: \"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\") " pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.562469 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e558d58-c5ad-41f5-930f-36ac26b1a1ea-proxy-tls\") pod \"machine-config-daemon-4qkmj\" (UID: \"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\") " pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.562488 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/57d86641-6721-4f54-bc04-f188d8d13079-hosts-file\") pod \"node-resolver-8x546\" (UID: \"57d86641-6721-4f54-bc04-f188d8d13079\") " pod="openshift-dns/node-resolver-8x546" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.562760 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.563724 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.564352 5008 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.565566 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.565627 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.566190 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.567181 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.567687 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.568695 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.569382 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.569886 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.575513 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.583975 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.591899 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.599362 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.607086 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.623877 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-2
6T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.641113 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.663217 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s6mv\" (UniqueName: \"kubernetes.io/projected/57d86641-6721-4f54-bc04-f188d8d13079-kube-api-access-6s6mv\") pod \"node-resolver-8x546\" (UID: \"57d86641-6721-4f54-bc04-f188d8d13079\") " pod="openshift-dns/node-resolver-8x546" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.663255 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkrlb\" (UniqueName: \"kubernetes.io/projected/8e558d58-c5ad-41f5-930f-36ac26b1a1ea-kube-api-access-mkrlb\") pod \"machine-config-daemon-4qkmj\" (UID: \"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\") " pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.663286 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8e558d58-c5ad-41f5-930f-36ac26b1a1ea-rootfs\") pod \"machine-config-daemon-4qkmj\" (UID: \"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\") " pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.663308 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8e558d58-c5ad-41f5-930f-36ac26b1a1ea-mcd-auth-proxy-config\") pod \"machine-config-daemon-4qkmj\" (UID: \"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\") " 
pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.663339 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/57d86641-6721-4f54-bc04-f188d8d13079-hosts-file\") pod \"node-resolver-8x546\" (UID: \"57d86641-6721-4f54-bc04-f188d8d13079\") " pod="openshift-dns/node-resolver-8x546" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.663362 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e558d58-c5ad-41f5-930f-36ac26b1a1ea-proxy-tls\") pod \"machine-config-daemon-4qkmj\" (UID: \"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\") " pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.663583 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/57d86641-6721-4f54-bc04-f188d8d13079-hosts-file\") pod \"node-resolver-8x546\" (UID: \"57d86641-6721-4f54-bc04-f188d8d13079\") " pod="openshift-dns/node-resolver-8x546" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.664367 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8e558d58-c5ad-41f5-930f-36ac26b1a1ea-mcd-auth-proxy-config\") pod \"machine-config-daemon-4qkmj\" (UID: \"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\") " pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.664478 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8e558d58-c5ad-41f5-930f-36ac26b1a1ea-rootfs\") pod \"machine-config-daemon-4qkmj\" (UID: \"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\") " pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 
26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.670913 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e558d58-c5ad-41f5-930f-36ac26b1a1ea-proxy-tls\") pod \"machine-config-daemon-4qkmj\" (UID: \"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\") " pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.671285 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.684027 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071"} Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.684094 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d4a4f0b934bfb152c03f8d5abd0461bab32007af9bb2ed98ecdd56ee12d9e1e6"} Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.684810 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c11abde4433d93189198996318fe3a337c14819608bb7f7fa97a8d6f92c73673"} Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.686260 5008 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.687808 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c"} Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.688066 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.689231 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841"} Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.689268 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e"} Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.689280 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3849404997f986c0c3692448b4480b127e6c7093c398e75c901aa6403c924e93"} Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.691071 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.692081 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.696432 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkrlb\" (UniqueName: \"kubernetes.io/projected/8e558d58-c5ad-41f5-930f-36ac26b1a1ea-kube-api-access-mkrlb\") pod \"machine-config-daemon-4qkmj\" (UID: \"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\") " pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.704695 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s6mv\" (UniqueName: \"kubernetes.io/projected/57d86641-6721-4f54-bc04-f188d8d13079-kube-api-access-6s6mv\") pod \"node-resolver-8x546\" (UID: \"57d86641-6721-4f54-bc04-f188d8d13079\") " pod="openshift-dns/node-resolver-8x546" Nov 26 
22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.721331 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.742051 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.754180 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.755238 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.762310 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-8x546" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.764360 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.772239 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.785488 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.796791 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.812021 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-ftgz4"] Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.812711 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.814122 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-r4xtd"] Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.814295 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zpbmz"] Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.815007 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.815268 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.816117 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.816261 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.816405 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.816502 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.816514 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 26 22:39:07 crc kubenswrapper[5008]: W1126 22:39:07.819066 5008 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": failed to list *v1.Secret: secrets "ovn-kubernetes-node-dockercfg-pwtwl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Nov 26 22:39:07 crc kubenswrapper[5008]: E1126 22:39:07.819095 5008 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pwtwl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-kubernetes-node-dockercfg-pwtwl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.819147 5008 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.819217 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.819333 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.819415 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.819442 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 26 22:39:07 crc kubenswrapper[5008]: W1126 22:39:07.819555 5008 reflector.go:561] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Nov 26 22:39:07 crc kubenswrapper[5008]: E1126 22:39:07.819576 5008 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.819587 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 26 22:39:07 crc kubenswrapper[5008]: W1126 22:39:07.819606 5008 reflector.go:561] object-"openshift-multus"/"multus-daemon-config": failed to list 
*v1.ConfigMap: configmaps "multus-daemon-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Nov 26 22:39:07 crc kubenswrapper[5008]: E1126 22:39:07.819617 5008 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-daemon-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"multus-daemon-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.824391 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4
a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.837089 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.853276 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.868940 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-host-var-lib-cni-bin\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.868996 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-host-var-lib-kubelet\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869021 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-kubelet\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869041 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/41e5d1a8-86e0-42e2-a446-8f8938091dc1-ovn-node-metrics-cert\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869063 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8509b0e0-c914-44a1-a657-ffb4f5a86c18-multus-daemon-config\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869084 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff27df38-53d5-442d-a931-90a20311879c-cnibin\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869105 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-prhq6\" (UniqueName: \"kubernetes.io/projected/ff27df38-53d5-442d-a931-90a20311879c-kube-api-access-prhq6\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869127 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-log-socket\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869147 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-multus-cni-dir\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869178 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-system-cni-dir\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869199 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-run-openvswitch\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869220 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-4sf7w\" (UniqueName: \"kubernetes.io/projected/41e5d1a8-86e0-42e2-a446-8f8938091dc1-kube-api-access-4sf7w\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869251 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-cni-netd\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869271 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff27df38-53d5-442d-a931-90a20311879c-cni-binary-copy\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869289 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-multus-socket-dir-parent\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869303 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-hostroot\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869317 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-run-netns\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869330 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8509b0e0-c914-44a1-a657-ffb4f5a86c18-cni-binary-copy\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869360 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-host-var-lib-cni-multus\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869374 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jskx\" (UniqueName: \"kubernetes.io/projected/8509b0e0-c914-44a1-a657-ffb4f5a86c18-kube-api-access-6jskx\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869400 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-var-lib-openvswitch\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869426 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-os-release\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869444 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-slash\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869458 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-run-systemd\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869472 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-run-ovn\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869485 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-run-ovn-kubernetes\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869498 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-host-run-netns\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869510 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-multus-conf-dir\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869526 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-systemd-units\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869540 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-etc-openvswitch\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869555 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ff27df38-53d5-442d-a931-90a20311879c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869571 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-cni-bin\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869585 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869600 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/41e5d1a8-86e0-42e2-a446-8f8938091dc1-ovnkube-config\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869613 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-host-run-k8s-cni-cncf-io\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869628 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-host-run-multus-certs\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 
22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869650 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff27df38-53d5-442d-a931-90a20311879c-os-release\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869664 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/41e5d1a8-86e0-42e2-a446-8f8938091dc1-env-overrides\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869684 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff27df38-53d5-442d-a931-90a20311879c-system-cni-dir\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869697 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff27df38-53d5-442d-a931-90a20311879c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869711 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-node-log\") pod \"ovnkube-node-zpbmz\" (UID: 
\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869725 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/41e5d1a8-86e0-42e2-a446-8f8938091dc1-ovnkube-script-lib\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869737 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-cnibin\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.869751 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-etc-kubernetes\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.874213 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.894264 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.903313 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.912079 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.920473 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.929258 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.968564 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.970710 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-kubelet\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.970739 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/41e5d1a8-86e0-42e2-a446-8f8938091dc1-ovn-node-metrics-cert\") pod \"ovnkube-node-zpbmz\" (UID: 
\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.970756 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8509b0e0-c914-44a1-a657-ffb4f5a86c18-multus-daemon-config\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.970770 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-multus-cni-dir\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.970786 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff27df38-53d5-442d-a931-90a20311879c-cnibin\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.970801 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prhq6\" (UniqueName: \"kubernetes.io/projected/ff27df38-53d5-442d-a931-90a20311879c-kube-api-access-prhq6\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.970817 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-log-socket\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.970831 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-system-cni-dir\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.970865 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sf7w\" (UniqueName: \"kubernetes.io/projected/41e5d1a8-86e0-42e2-a446-8f8938091dc1-kube-api-access-4sf7w\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.970879 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-run-openvswitch\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.970894 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-cni-netd\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.970918 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff27df38-53d5-442d-a931-90a20311879c-cni-binary-copy\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 
22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.970932 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-multus-socket-dir-parent\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.970947 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-hostroot\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.970982 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-host-var-lib-cni-multus\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.970979 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-kubelet\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971005 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jskx\" (UniqueName: \"kubernetes.io/projected/8509b0e0-c914-44a1-a657-ffb4f5a86c18-kube-api-access-6jskx\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971023 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-run-netns\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971020 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-log-socket\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971045 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8509b0e0-c914-44a1-a657-ffb4f5a86c18-cni-binary-copy\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971054 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff27df38-53d5-442d-a931-90a20311879c-cnibin\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971070 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-var-lib-openvswitch\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971101 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-os-release\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971125 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-slash\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971146 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-run-systemd\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971150 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-system-cni-dir\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971154 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-multus-cni-dir\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971166 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-run-ovn\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971176 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-multus-socket-dir-parent\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971203 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-run-ovn\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971099 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-run-openvswitch\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971248 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-var-lib-openvswitch\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971204 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-run-ovn-kubernetes\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: 
I1126 22:39:07.971225 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-run-ovn-kubernetes\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971249 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-slash\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971273 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-run-systemd\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971218 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-cni-netd\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971360 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-host-run-netns\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971406 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" 
(UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-host-var-lib-cni-multus\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971411 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-multus-conf-dir\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971432 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-hostroot\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971445 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-etc-openvswitch\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971478 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-multus-conf-dir\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971488 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-etc-openvswitch\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971496 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-host-run-netns\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971521 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-run-netns\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971548 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-systemd-units\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971585 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971626 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-systemd-units\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc 
kubenswrapper[5008]: I1126 22:39:07.971633 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971700 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/41e5d1a8-86e0-42e2-a446-8f8938091dc1-ovnkube-config\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971733 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-host-run-k8s-cni-cncf-io\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971746 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-os-release\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971768 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-host-run-multus-certs\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971787 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-host-run-k8s-cni-cncf-io\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971860 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-host-run-multus-certs\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971869 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ff27df38-53d5-442d-a931-90a20311879c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971883 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff27df38-53d5-442d-a931-90a20311879c-cni-binary-copy\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971908 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-cni-bin\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971933 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/ff27df38-53d5-442d-a931-90a20311879c-os-release\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971956 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-cni-bin\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.971989 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/41e5d1a8-86e0-42e2-a446-8f8938091dc1-env-overrides\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.972015 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff27df38-53d5-442d-a931-90a20311879c-os-release\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.972012 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8509b0e0-c914-44a1-a657-ffb4f5a86c18-cni-binary-copy\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.972037 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/ff27df38-53d5-442d-a931-90a20311879c-system-cni-dir\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.972061 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff27df38-53d5-442d-a931-90a20311879c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.972078 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-node-log\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.972101 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-etc-kubernetes\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.972123 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/41e5d1a8-86e0-42e2-a446-8f8938091dc1-ovnkube-script-lib\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.972146 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-cnibin\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.972194 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-host-var-lib-cni-bin\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.972215 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-host-var-lib-kubelet\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.972224 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-etc-kubernetes\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.972268 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-host-var-lib-kubelet\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.972274 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-node-log\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.972315 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-cnibin\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.972316 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8509b0e0-c914-44a1-a657-ffb4f5a86c18-host-var-lib-cni-bin\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.972356 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/41e5d1a8-86e0-42e2-a446-8f8938091dc1-ovnkube-config\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.972367 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff27df38-53d5-442d-a931-90a20311879c-system-cni-dir\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.972373 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ff27df38-53d5-442d-a931-90a20311879c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 
22:39:07.972541 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff27df38-53d5-442d-a931-90a20311879c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.972574 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/41e5d1a8-86e0-42e2-a446-8f8938091dc1-env-overrides\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.972862 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/41e5d1a8-86e0-42e2-a446-8f8938091dc1-ovnkube-script-lib\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:07 crc kubenswrapper[5008]: I1126 22:39:07.975562 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/41e5d1a8-86e0-42e2-a446-8f8938091dc1-ovn-node-metrics-cert\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.016977 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prhq6\" (UniqueName: \"kubernetes.io/projected/ff27df38-53d5-442d-a931-90a20311879c-kube-api-access-prhq6\") pod \"multus-additional-cni-plugins-ftgz4\" (UID: \"ff27df38-53d5-442d-a931-90a20311879c\") " pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.058392 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jskx\" (UniqueName: \"kubernetes.io/projected/8509b0e0-c914-44a1-a657-ffb4f5a86c18-kube-api-access-6jskx\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.070433 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.072799 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:39:08 crc kubenswrapper[5008]: E1126 22:39:08.072950 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:39:10.072921974 +0000 UTC m=+25.485615966 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.073022 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:08 crc kubenswrapper[5008]: E1126 22:39:08.073155 5008 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 22:39:08 crc kubenswrapper[5008]: E1126 22:39:08.073215 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:10.073204862 +0000 UTC m=+25.485898864 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.108538 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858
cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.154584 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.155004 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:08 crc kubenswrapper[5008]: W1126 22:39:08.164703 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff27df38_53d5_442d_a931_90a20311879c.slice/crio-8d7156355de27d2a888eb64bcccfe2ef2b74230dd8900fa5fb2d8b13c622e64a WatchSource:0}: Error finding container 8d7156355de27d2a888eb64bcccfe2ef2b74230dd8900fa5fb2d8b13c622e64a: Status 404 returned error can't find the container with id 8d7156355de27d2a888eb64bcccfe2ef2b74230dd8900fa5fb2d8b13c622e64a Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.173704 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.173770 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:08 crc 
kubenswrapper[5008]: I1126 22:39:08.173830 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:08 crc kubenswrapper[5008]: E1126 22:39:08.173919 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 22:39:08 crc kubenswrapper[5008]: E1126 22:39:08.173946 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 22:39:08 crc kubenswrapper[5008]: E1126 22:39:08.173978 5008 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:08 crc kubenswrapper[5008]: E1126 22:39:08.174001 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 22:39:08 crc kubenswrapper[5008]: E1126 22:39:08.174024 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 22:39:08 crc kubenswrapper[5008]: E1126 22:39:08.174036 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:10.174014612 +0000 UTC m=+25.586708684 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:08 crc kubenswrapper[5008]: E1126 22:39:08.174041 5008 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:08 crc kubenswrapper[5008]: E1126 22:39:08.174111 5008 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 22:39:08 crc kubenswrapper[5008]: E1126 22:39:08.174126 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:10.174103705 +0000 UTC m=+25.586797747 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:08 crc kubenswrapper[5008]: E1126 22:39:08.174204 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:10.174183827 +0000 UTC m=+25.586877859 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.199701 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with 
incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\
\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"m
ountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name
\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\
\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.229069 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.270391 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.310377 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.348829 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.387721 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.440335 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:08Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.473162 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:08Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.513476 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:08Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.517761 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:08 crc kubenswrapper[5008]: E1126 22:39:08.517909 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.517782 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:08 crc kubenswrapper[5008]: E1126 22:39:08.518005 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.517766 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:08 crc kubenswrapper[5008]: E1126 22:39:08.518052 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.693407 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" event={"ID":"8e558d58-c5ad-41f5-930f-36ac26b1a1ea","Type":"ContainerStarted","Data":"e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c"} Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.693450 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" event={"ID":"8e558d58-c5ad-41f5-930f-36ac26b1a1ea","Type":"ContainerStarted","Data":"c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574"} Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.693460 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" event={"ID":"8e558d58-c5ad-41f5-930f-36ac26b1a1ea","Type":"ContainerStarted","Data":"68643c85231e3dadb7282d0db1ed20e541aa7806f93a1f5b1efbedc6d979d9e8"} Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.694415 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" event={"ID":"ff27df38-53d5-442d-a931-90a20311879c","Type":"ContainerStarted","Data":"a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78"} Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.694438 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" event={"ID":"ff27df38-53d5-442d-a931-90a20311879c","Type":"ContainerStarted","Data":"8d7156355de27d2a888eb64bcccfe2ef2b74230dd8900fa5fb2d8b13c622e64a"} Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.695528 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8x546" 
event={"ID":"57d86641-6721-4f54-bc04-f188d8d13079","Type":"ContainerStarted","Data":"0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957"} Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.695573 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8x546" event={"ID":"57d86641-6721-4f54-bc04-f188d8d13079","Type":"ContainerStarted","Data":"cafba9b30b0a743fff8710fe1b1174d049e08ad18f17370fb5c2cfa89b1703d8"} Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.706239 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:08Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.716073 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:08Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.732885 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:08Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.744152 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:08Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.758822 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:08Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.774791 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:08Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.775018 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.791657 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sf7w\" (UniqueName: 
\"kubernetes.io/projected/41e5d1a8-86e0-42e2-a446-8f8938091dc1-kube-api-access-4sf7w\") pod \"ovnkube-node-zpbmz\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.811252 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:08Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.861017 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:08Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.863304 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.916250 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:08Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:08 
crc kubenswrapper[5008]: I1126 22:39:08.971378 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:08Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:08 crc kubenswrapper[5008]: E1126 22:39:08.971652 5008 configmap.go:193] Couldn't get configMap openshift-multus/multus-daemon-config: failed to sync configmap cache: timed out waiting for the condition Nov 26 22:39:08 crc kubenswrapper[5008]: E1126 22:39:08.971742 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8509b0e0-c914-44a1-a657-ffb4f5a86c18-multus-daemon-config podName:8509b0e0-c914-44a1-a657-ffb4f5a86c18 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:09.471719105 +0000 UTC m=+24.884413107 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "multus-daemon-config" (UniqueName: "kubernetes.io/configmap/8509b0e0-c914-44a1-a657-ffb4f5a86c18-multus-daemon-config") pod "multus-r4xtd" (UID: "8509b0e0-c914-44a1-a657-ffb4f5a86c18") : failed to sync configmap cache: timed out waiting for the condition Nov 26 22:39:08 crc kubenswrapper[5008]: I1126 22:39:08.994823 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:08Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.038940 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab
99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.070386 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.075675 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:09 crc kubenswrapper[5008]: W1126 22:39:09.088471 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41e5d1a8_86e0_42e2_a446_8f8938091dc1.slice/crio-d174e19a6a937007481b98f22da236d13a22050fd1ec28bad2fcd4fc0d55660f WatchSource:0}: Error finding container d174e19a6a937007481b98f22da236d13a22050fd1ec28bad2fcd4fc0d55660f: Status 404 returned error can't find the container with id d174e19a6a937007481b98f22da236d13a22050fd1ec28bad2fcd4fc0d55660f Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.113103 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.153651 5008 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.191944 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.247188 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.280399 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.313229 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.315572 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-cq57l"] Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.315902 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cq57l" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.343845 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.363114 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.383770 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.385323 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39db8495-eee6-4117-b69a-aca4f98eb640-host\") pod \"node-ca-cq57l\" (UID: \"39db8495-eee6-4117-b69a-aca4f98eb640\") " pod="openshift-image-registry/node-ca-cq57l" Nov 26 22:39:09 crc 
kubenswrapper[5008]: I1126 22:39:09.385419 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/39db8495-eee6-4117-b69a-aca4f98eb640-serviceca\") pod \"node-ca-cq57l\" (UID: \"39db8495-eee6-4117-b69a-aca4f98eb640\") " pod="openshift-image-registry/node-ca-cq57l" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.385477 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t288w\" (UniqueName: \"kubernetes.io/projected/39db8495-eee6-4117-b69a-aca4f98eb640-kube-api-access-t288w\") pod \"node-ca-cq57l\" (UID: \"39db8495-eee6-4117-b69a-aca4f98eb640\") " pod="openshift-image-registry/node-ca-cq57l" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.403796 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.432336 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.443410 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.486642 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39db8495-eee6-4117-b69a-aca4f98eb640-host\") pod \"node-ca-cq57l\" (UID: \"39db8495-eee6-4117-b69a-aca4f98eb640\") " pod="openshift-image-registry/node-ca-cq57l" Nov 26 22:39:09 crc 
kubenswrapper[5008]: I1126 22:39:09.486707 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8509b0e0-c914-44a1-a657-ffb4f5a86c18-multus-daemon-config\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.486737 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/39db8495-eee6-4117-b69a-aca4f98eb640-serviceca\") pod \"node-ca-cq57l\" (UID: \"39db8495-eee6-4117-b69a-aca4f98eb640\") " pod="openshift-image-registry/node-ca-cq57l" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.486760 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t288w\" (UniqueName: \"kubernetes.io/projected/39db8495-eee6-4117-b69a-aca4f98eb640-kube-api-access-t288w\") pod \"node-ca-cq57l\" (UID: \"39db8495-eee6-4117-b69a-aca4f98eb640\") " pod="openshift-image-registry/node-ca-cq57l" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.486761 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39db8495-eee6-4117-b69a-aca4f98eb640-host\") pod \"node-ca-cq57l\" (UID: \"39db8495-eee6-4117-b69a-aca4f98eb640\") " pod="openshift-image-registry/node-ca-cq57l" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.487366 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8509b0e0-c914-44a1-a657-ffb4f5a86c18-multus-daemon-config\") pod \"multus-r4xtd\" (UID: \"8509b0e0-c914-44a1-a657-ffb4f5a86c18\") " pod="openshift-multus/multus-r4xtd" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.488017 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/39db8495-eee6-4117-b69a-aca4f98eb640-serviceca\") pod \"node-ca-cq57l\" (UID: \"39db8495-eee6-4117-b69a-aca4f98eb640\") " pod="openshift-image-registry/node-ca-cq57l" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.493325 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.519890 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t288w\" (UniqueName: \"kubernetes.io/projected/39db8495-eee6-4117-b69a-aca4f98eb640-kube-api-access-t288w\") pod \"node-ca-cq57l\" (UID: \"39db8495-eee6-4117-b69a-aca4f98eb640\") " pod="openshift-image-registry/node-ca-cq57l" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.552290 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafb
eeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.611485 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.626740 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cq57l" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.639919 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.671916 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.684875 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-r4xtd" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.704697 5008 generic.go:334] "Generic (PLEG): container finished" podID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerID="b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586" exitCode=0 Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.704773 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerDied","Data":"b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586"} Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.704807 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerStarted","Data":"d174e19a6a937007481b98f22da236d13a22050fd1ec28bad2fcd4fc0d55660f"} Nov 26 22:39:09 crc kubenswrapper[5008]: W1126 22:39:09.707902 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8509b0e0_c914_44a1_a657_ffb4f5a86c18.slice/crio-0e09fe41dc1c9b063379d6189d34dfd66efc69297e0ab34f4296bd96b8390bd0 WatchSource:0}: Error finding container 0e09fe41dc1c9b063379d6189d34dfd66efc69297e0ab34f4296bd96b8390bd0: Status 404 returned error can't find the container with id 0e09fe41dc1c9b063379d6189d34dfd66efc69297e0ab34f4296bd96b8390bd0 Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.709716 5008 generic.go:334] "Generic (PLEG): container finished" podID="ff27df38-53d5-442d-a931-90a20311879c" containerID="a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78" exitCode=0 Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.709748 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" 
event={"ID":"ff27df38-53d5-442d-a931-90a20311879c","Type":"ContainerDied","Data":"a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78"} Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.712797 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f"} Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.713727 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cq57l" event={"ID":"39db8495-eee6-4117-b69a-aca4f98eb640","Type":"ContainerStarted","Data":"7cde49bb27505e95d315b600fcc269eebbf0d8543d04bf1e9c759a4c83971cbf"} Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.717872 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.753622 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.799592 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.832650 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.873473 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.919831 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.950942 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:09 crc kubenswrapper[5008]: I1126 22:39:09.990623 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:09Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.035059 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.071827 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.097981 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:39:10 crc kubenswrapper[5008]: E1126 22:39:10.098108 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:39:14.098090687 +0000 UTC m=+29.510784689 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.098172 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:10 crc kubenswrapper[5008]: E1126 22:39:10.098247 5008 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 22:39:10 crc kubenswrapper[5008]: E1126 22:39:10.098282 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:14.098275753 +0000 UTC m=+29.510969755 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.130053 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.156646 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.191689 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.199141 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.199183 5008 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.199217 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:10 crc kubenswrapper[5008]: E1126 22:39:10.199337 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 22:39:10 crc kubenswrapper[5008]: E1126 22:39:10.199354 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 22:39:10 crc kubenswrapper[5008]: E1126 22:39:10.199353 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 22:39:10 crc kubenswrapper[5008]: E1126 22:39:10.199388 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 22:39:10 crc kubenswrapper[5008]: E1126 22:39:10.199401 5008 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:10 crc kubenswrapper[5008]: E1126 22:39:10.199411 5008 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 22:39:10 crc kubenswrapper[5008]: E1126 22:39:10.199364 5008 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:10 crc kubenswrapper[5008]: E1126 22:39:10.199452 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:14.199437969 +0000 UTC m=+29.612131971 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:10 crc kubenswrapper[5008]: E1126 22:39:10.199465 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:14.19946042 +0000 UTC m=+29.612154422 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 22:39:10 crc kubenswrapper[5008]: E1126 22:39:10.199475 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:14.19947046 +0000 UTC m=+29.612164462 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.240069 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.271793 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.323124 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.353594 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.396187 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.518163 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.518244 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:10 crc kubenswrapper[5008]: E1126 22:39:10.518611 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:10 crc kubenswrapper[5008]: E1126 22:39:10.518479 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.518259 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:10 crc kubenswrapper[5008]: E1126 22:39:10.518703 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.722009 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r4xtd" event={"ID":"8509b0e0-c914-44a1-a657-ffb4f5a86c18","Type":"ContainerStarted","Data":"56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625"} Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.722059 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r4xtd" event={"ID":"8509b0e0-c914-44a1-a657-ffb4f5a86c18","Type":"ContainerStarted","Data":"0e09fe41dc1c9b063379d6189d34dfd66efc69297e0ab34f4296bd96b8390bd0"} Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.725676 5008 generic.go:334] "Generic (PLEG): container finished" podID="ff27df38-53d5-442d-a931-90a20311879c" containerID="7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77" exitCode=0 Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.725839 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" event={"ID":"ff27df38-53d5-442d-a931-90a20311879c","Type":"ContainerDied","Data":"7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77"} Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.728864 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cq57l" event={"ID":"39db8495-eee6-4117-b69a-aca4f98eb640","Type":"ContainerStarted","Data":"681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc9433194eaa24024"} Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.735831 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerStarted","Data":"7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f"} Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 
22:39:10.735885 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerStarted","Data":"219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e"} Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.735899 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerStarted","Data":"ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523"} Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.735911 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerStarted","Data":"fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c"} Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.735923 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerStarted","Data":"422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634"} Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.746738 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.767442 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.782167 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.798134 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.814873 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.827089 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.846936 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.862355 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.875459 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.886933 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.898097 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.910913 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.925809 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.952038 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:10 crc kubenswrapper[5008]: I1126 22:39:10.992827 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:10Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.031222 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.072659 5008 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.114070 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.155356 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.194721 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.243688 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.282877 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.312624 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.355166 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.392179 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.431746 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.470401 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc94331
94eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.513617 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.553777 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.598933 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.741633 5008 generic.go:334] "Generic (PLEG): container finished" podID="ff27df38-53d5-442d-a931-90a20311879c" containerID="d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62" exitCode=0 Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.741699 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" event={"ID":"ff27df38-53d5-442d-a931-90a20311879c","Type":"ContainerDied","Data":"d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62"} Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.747472 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerStarted","Data":"83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac"} Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.760874 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.777112 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.795749 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.811355 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.837676 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.869157 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.920499 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.935580 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.952933 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:11 crc kubenswrapper[5008]: I1126 22:39:11.992728 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T22:39:11Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.031545 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.072331 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc94331
94eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.116805 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.152832 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.204172 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.446931 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.449052 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.449111 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.449130 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.449297 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.457508 5008 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.457716 5008 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.458660 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.458698 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.458708 5008 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.458724 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.458735 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:12Z","lastTransitionTime":"2025-11-26T22:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:12 crc kubenswrapper[5008]: E1126 22:39:12.477810 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.482561 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.482619 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.482632 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.482651 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.482662 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:12Z","lastTransitionTime":"2025-11-26T22:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:12 crc kubenswrapper[5008]: E1126 22:39:12.495217 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.498524 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.498558 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.498566 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.498580 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.498590 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:12Z","lastTransitionTime":"2025-11-26T22:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.518343 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.518352 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.518447 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:12 crc kubenswrapper[5008]: E1126 22:39:12.518563 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:12 crc kubenswrapper[5008]: E1126 22:39:12.518733 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:12 crc kubenswrapper[5008]: E1126 22:39:12.518845 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:12 crc kubenswrapper[5008]: E1126 22:39:12.519852 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.523253 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.523286 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.523294 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.523308 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.523318 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:12Z","lastTransitionTime":"2025-11-26T22:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:12 crc kubenswrapper[5008]: E1126 22:39:12.536898 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.540182 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.540215 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.540224 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.540237 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.540245 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:12Z","lastTransitionTime":"2025-11-26T22:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:12 crc kubenswrapper[5008]: E1126 22:39:12.553304 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: E1126 22:39:12.553764 5008 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.558315 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.558340 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.558351 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.558362 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.558371 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:12Z","lastTransitionTime":"2025-11-26T22:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.661548 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.661587 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.661599 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.661632 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.661644 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:12Z","lastTransitionTime":"2025-11-26T22:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.754747 5008 generic.go:334] "Generic (PLEG): container finished" podID="ff27df38-53d5-442d-a931-90a20311879c" containerID="3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a" exitCode=0 Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.754812 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" event={"ID":"ff27df38-53d5-442d-a931-90a20311879c","Type":"ContainerDied","Data":"3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a"} Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.763471 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.763541 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.763565 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.763598 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.763622 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:12Z","lastTransitionTime":"2025-11-26T22:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.774062 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.794256 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.818299 5008 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.840224 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.856279 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.865838 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.865881 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.865894 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:12 crc 
kubenswrapper[5008]: I1126 22:39:12.866508 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.866555 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:12Z","lastTransitionTime":"2025-11-26T22:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.868086 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.877887 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafb
eeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.899628 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.911369 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.930048 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.941915 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.955247 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.966597 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.968264 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.968290 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.968299 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.968315 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.968325 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:12Z","lastTransitionTime":"2025-11-26T22:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.976895 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:12 crc kubenswrapper[5008]: I1126 22:39:12.987653 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc9433194eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:12Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.071197 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.071253 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.071272 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 
22:39:13.071293 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.071307 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:13Z","lastTransitionTime":"2025-11-26T22:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.174367 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.174426 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.174449 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.174483 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.174504 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:13Z","lastTransitionTime":"2025-11-26T22:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.277495 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.277744 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.277808 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.277877 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.277951 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:13Z","lastTransitionTime":"2025-11-26T22:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.381020 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.381402 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.381558 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.381711 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.381844 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:13Z","lastTransitionTime":"2025-11-26T22:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.485053 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.485125 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.485138 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.485153 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.485164 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:13Z","lastTransitionTime":"2025-11-26T22:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.589008 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.589343 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.589355 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.589372 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.589383 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:13Z","lastTransitionTime":"2025-11-26T22:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.691849 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.691909 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.691925 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.691946 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.691982 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:13Z","lastTransitionTime":"2025-11-26T22:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.760447 5008 generic.go:334] "Generic (PLEG): container finished" podID="ff27df38-53d5-442d-a931-90a20311879c" containerID="a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9" exitCode=0 Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.760517 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" event={"ID":"ff27df38-53d5-442d-a931-90a20311879c","Type":"ContainerDied","Data":"a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9"} Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.766819 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerStarted","Data":"87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591"} Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.789897 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:13Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.794242 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.794280 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.794292 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.794308 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.794318 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:13Z","lastTransitionTime":"2025-11-26T22:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.807528 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:13Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.821914 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:13Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.834144 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T22:39:13Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.845577 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:13Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.859927 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc94331
94eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:13Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.873551 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:13Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.886958 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:13Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.896883 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.896924 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.896934 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.896950 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.896974 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:13Z","lastTransitionTime":"2025-11-26T22:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.900032 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:13Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.916909 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:13Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.929193 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:13Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.944138 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:13Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.954899 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:13Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.974093 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:13Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.987006 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:13Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.999776 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.999814 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.999825 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.999886 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:13 crc kubenswrapper[5008]: I1126 22:39:13.999900 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:13Z","lastTransitionTime":"2025-11-26T22:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.103293 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.103340 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.103358 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.103375 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.103389 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:14Z","lastTransitionTime":"2025-11-26T22:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.138096 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:39:14 crc kubenswrapper[5008]: E1126 22:39:14.138216 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 22:39:22.13819457 +0000 UTC m=+37.550888582 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.138251 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:14 crc kubenswrapper[5008]: E1126 22:39:14.138404 5008 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 22:39:14 crc kubenswrapper[5008]: E1126 22:39:14.138458 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:22.138448278 +0000 UTC m=+37.551142280 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.206391 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.206461 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.206486 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.206518 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.206545 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:14Z","lastTransitionTime":"2025-11-26T22:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.239430 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.239500 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.239560 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:14 crc kubenswrapper[5008]: E1126 22:39:14.239701 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 22:39:14 crc kubenswrapper[5008]: E1126 22:39:14.239721 5008 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 22:39:14 crc kubenswrapper[5008]: E1126 22:39:14.239749 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 22:39:14 crc kubenswrapper[5008]: E1126 22:39:14.239766 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 22:39:14 crc kubenswrapper[5008]: E1126 22:39:14.239776 5008 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:14 crc kubenswrapper[5008]: E1126 22:39:14.239800 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 22:39:14 crc kubenswrapper[5008]: E1126 22:39:14.239821 5008 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:14 crc kubenswrapper[5008]: E1126 22:39:14.239867 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:22.239826511 +0000 UTC m=+37.652520553 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 22:39:14 crc kubenswrapper[5008]: E1126 22:39:14.239913 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:22.239884393 +0000 UTC m=+37.652578465 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:14 crc kubenswrapper[5008]: E1126 22:39:14.240004 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:22.239936745 +0000 UTC m=+37.652630797 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.309367 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.309439 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.309463 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.309493 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.309517 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:14Z","lastTransitionTime":"2025-11-26T22:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.412585 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.413039 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.413055 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.413075 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.413090 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:14Z","lastTransitionTime":"2025-11-26T22:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.520149 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.520193 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.520267 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:14 crc kubenswrapper[5008]: E1126 22:39:14.521528 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:14 crc kubenswrapper[5008]: E1126 22:39:14.521573 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:14 crc kubenswrapper[5008]: E1126 22:39:14.521665 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.521702 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.521744 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.521766 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.521793 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.521814 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:14Z","lastTransitionTime":"2025-11-26T22:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.625243 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.625285 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.625294 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.625309 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.625319 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:14Z","lastTransitionTime":"2025-11-26T22:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.728416 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.728464 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.728475 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.728495 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.728507 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:14Z","lastTransitionTime":"2025-11-26T22:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.773515 5008 generic.go:334] "Generic (PLEG): container finished" podID="ff27df38-53d5-442d-a931-90a20311879c" containerID="29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883" exitCode=0 Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.773575 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" event={"ID":"ff27df38-53d5-442d-a931-90a20311879c","Type":"ContainerDied","Data":"29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883"} Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.801125 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:14Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.818150 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:14Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.830928 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.830992 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.831018 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.831035 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.831056 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:14Z","lastTransitionTime":"2025-11-26T22:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.836114 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:14Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.848251 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T22:39:14Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.859126 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:14Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.869898 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc94331
94eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:14Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.882350 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:14Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.893194 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:14Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.906569 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129
af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
1-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:14Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.919099 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:14Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.932303 5008 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:14Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.935115 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.935147 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.935158 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.935173 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.935184 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:14Z","lastTransitionTime":"2025-11-26T22:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.944415 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:14Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.955902 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:14Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.965664 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:14Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:14 crc kubenswrapper[5008]: I1126 22:39:14.982640 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:14Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.037166 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.037191 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.037200 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.037212 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.037220 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:15Z","lastTransitionTime":"2025-11-26T22:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.138922 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.139258 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.139267 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.139284 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.139293 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:15Z","lastTransitionTime":"2025-11-26T22:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.241734 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.241792 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.241815 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.241860 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.241884 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:15Z","lastTransitionTime":"2025-11-26T22:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.261385 5008 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.345989 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.346055 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.346071 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.346100 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.346116 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:15Z","lastTransitionTime":"2025-11-26T22:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.448612 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.448649 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.448658 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.448675 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.448686 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:15Z","lastTransitionTime":"2025-11-26T22:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.540872 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.552449 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.552520 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.552538 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.552565 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.552586 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:15Z","lastTransitionTime":"2025-11-26T22:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.560882 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.578824 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.596942 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.619227 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.632564 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc94331
94eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.651236 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.655077 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.655124 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.655135 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.655151 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.655161 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:15Z","lastTransitionTime":"2025-11-26T22:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.670437 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.693177 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.708013 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.723661 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.737510 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.753335 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.757050 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:15 crc 
kubenswrapper[5008]: I1126 22:39:15.757087 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.757099 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.757116 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.757128 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:15Z","lastTransitionTime":"2025-11-26T22:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.779192 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.787195 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" event={"ID":"ff27df38-53d5-442d-a931-90a20311879c","Type":"ContainerStarted","Data":"0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119"} Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.797463 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.798174 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" 
event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerStarted","Data":"b6a231efa72725ab8997ee7949f1c1f89001ed2200fe2ee01fa7b77a74edd161"} Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.799423 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.810702 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.821486 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c2
0c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.828154 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.832301 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc9433194eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.847032 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.859087 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.859160 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.859179 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.859225 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.859245 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:15Z","lastTransitionTime":"2025-11-26T22:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.864122 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e23
5986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.877331 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7
f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.891477 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.906016 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.921902 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.934829 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.956015 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.961590 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.961646 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.961664 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.961689 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.961707 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:15Z","lastTransitionTime":"2025-11-26T22:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.975038 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z 
is after 2025-08-24T17:21:41Z" Nov 26 22:39:15 crc kubenswrapper[5008]: I1126 22:39:15.995028 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11
-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:15Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.009327 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.024495 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.040476 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.056653 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.063686 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.063726 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.063738 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.063760 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.063773 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:16Z","lastTransitionTime":"2025-11-26T22:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.078632 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.094712 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.125863 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a231efa72725ab8997ee7949f1c1f89001ed2200fe2ee01fa7b77a74edd161\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.147027 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.166685 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.166740 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.166763 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.166788 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.166807 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:16Z","lastTransitionTime":"2025-11-26T22:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.183029 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.204162 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.225510 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.239924 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.250289 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.261593 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc94331
94eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.269319 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.269507 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.269591 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.269678 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.269800 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:16Z","lastTransitionTime":"2025-11-26T22:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.277253 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.294092 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.308822 5008 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.372721 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.372807 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.372835 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.372868 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.372893 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:16Z","lastTransitionTime":"2025-11-26T22:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.476318 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.476371 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.476389 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.476410 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.476424 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:16Z","lastTransitionTime":"2025-11-26T22:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.517616 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.517669 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.517708 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:16 crc kubenswrapper[5008]: E1126 22:39:16.517818 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:16 crc kubenswrapper[5008]: E1126 22:39:16.517927 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:16 crc kubenswrapper[5008]: E1126 22:39:16.518197 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.578652 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.578703 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.578715 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.578732 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.578744 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:16Z","lastTransitionTime":"2025-11-26T22:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.682306 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.682729 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.682896 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.683114 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.683279 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:16Z","lastTransitionTime":"2025-11-26T22:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.787104 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.787163 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.787180 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.787203 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.787221 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:16Z","lastTransitionTime":"2025-11-26T22:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.802230 5008 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.802906 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.837858 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.859070 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.877512 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.890472 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.890563 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.890581 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.890604 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.890622 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:16Z","lastTransitionTime":"2025-11-26T22:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.894273 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc9433194eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.917746 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.938544 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.963772 5008 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.990625 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:16Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.993057 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.993131 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.993157 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.993188 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:16 crc kubenswrapper[5008]: I1126 22:39:16.993211 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:16Z","lastTransitionTime":"2025-11-26T22:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.014886 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:17Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.040224 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:17Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.052429 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:17Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.072282 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a231efa72725ab8997ee7949f1c1f89001ed2200fe2ee01fa7b77a74edd161\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:17Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.086368 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:17Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.095160 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.095219 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.095233 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.095250 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.095262 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:17Z","lastTransitionTime":"2025-11-26T22:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.104619 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:17Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.117873 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:17Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.132322 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:17Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.198404 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.198455 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.198479 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.198509 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.198531 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:17Z","lastTransitionTime":"2025-11-26T22:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.301266 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.301360 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.301379 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.301402 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.301421 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:17Z","lastTransitionTime":"2025-11-26T22:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.404483 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.404562 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.404583 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.404609 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.404630 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:17Z","lastTransitionTime":"2025-11-26T22:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.507307 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.507360 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.507383 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.507406 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.507423 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:17Z","lastTransitionTime":"2025-11-26T22:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.613352 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.614670 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.614714 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.614747 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.614769 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:17Z","lastTransitionTime":"2025-11-26T22:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.718229 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.718293 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.718311 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.718336 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.718353 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:17Z","lastTransitionTime":"2025-11-26T22:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.805777 5008 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.821287 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.821335 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.821352 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.821377 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.821395 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:17Z","lastTransitionTime":"2025-11-26T22:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.925023 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.925093 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.925117 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.925151 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:17 crc kubenswrapper[5008]: I1126 22:39:17.925174 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:17Z","lastTransitionTime":"2025-11-26T22:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.028222 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.028402 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.028431 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.028463 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.028485 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:18Z","lastTransitionTime":"2025-11-26T22:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.131469 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.131563 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.131577 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.131602 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.131618 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:18Z","lastTransitionTime":"2025-11-26T22:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.234910 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.235061 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.235079 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.235104 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.235121 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:18Z","lastTransitionTime":"2025-11-26T22:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.337810 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.337874 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.337896 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.337925 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.337945 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:18Z","lastTransitionTime":"2025-11-26T22:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.441064 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.441122 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.441141 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.441165 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.441227 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:18Z","lastTransitionTime":"2025-11-26T22:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.517650 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.517714 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.517812 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:18 crc kubenswrapper[5008]: E1126 22:39:18.517920 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:18 crc kubenswrapper[5008]: E1126 22:39:18.518031 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:18 crc kubenswrapper[5008]: E1126 22:39:18.518085 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.544096 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.544145 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.544157 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.544176 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.544194 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:18Z","lastTransitionTime":"2025-11-26T22:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.646463 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.646499 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.646511 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.646529 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.646541 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:18Z","lastTransitionTime":"2025-11-26T22:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.749908 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.750037 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.750078 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.750115 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.750135 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:18Z","lastTransitionTime":"2025-11-26T22:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.808876 5008 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.853135 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.853171 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.853180 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.853196 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.853205 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:18Z","lastTransitionTime":"2025-11-26T22:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.955258 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.955333 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.955348 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.955366 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:18 crc kubenswrapper[5008]: I1126 22:39:18.955378 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:18Z","lastTransitionTime":"2025-11-26T22:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.058467 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.058537 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.058558 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.058586 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.058604 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:19Z","lastTransitionTime":"2025-11-26T22:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.161119 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.161186 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.161204 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.161230 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.161247 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:19Z","lastTransitionTime":"2025-11-26T22:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.264245 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.264282 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.264293 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.264371 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.264386 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:19Z","lastTransitionTime":"2025-11-26T22:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.368026 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.368096 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.368120 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.368153 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.368179 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:19Z","lastTransitionTime":"2025-11-26T22:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.470600 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.470655 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.470675 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.470702 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.470724 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:19Z","lastTransitionTime":"2025-11-26T22:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.577028 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.577106 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.577126 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.577154 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.577177 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:19Z","lastTransitionTime":"2025-11-26T22:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.680451 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.680505 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.680523 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.680547 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.680564 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:19Z","lastTransitionTime":"2025-11-26T22:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.782898 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.782998 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.783017 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.783044 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.783118 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:19Z","lastTransitionTime":"2025-11-26T22:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.815585 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovnkube-controller/0.log" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.820485 5008 generic.go:334] "Generic (PLEG): container finished" podID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerID="b6a231efa72725ab8997ee7949f1c1f89001ed2200fe2ee01fa7b77a74edd161" exitCode=1 Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.820556 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerDied","Data":"b6a231efa72725ab8997ee7949f1c1f89001ed2200fe2ee01fa7b77a74edd161"} Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.821805 5008 scope.go:117] "RemoveContainer" containerID="b6a231efa72725ab8997ee7949f1c1f89001ed2200fe2ee01fa7b77a74edd161" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.842627 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc9433194eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:19Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.870098 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T2
2:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:19Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.886581 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.886720 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.886792 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.886828 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.886892 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:19Z","lastTransitionTime":"2025-11-26T22:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.888993 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:19Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.914398 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:19Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.937254 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:19Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.967619 5008 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:19Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.989661 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-26T22:39:19Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.990844 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.990927 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.990946 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.990997 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:19 crc kubenswrapper[5008]: I1126 22:39:19.991016 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:19Z","lastTransitionTime":"2025-11-26T22:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.023461 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a231efa72725ab8997ee7949f1c1f89001ed2200fe2ee01fa7b77a74edd161\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6a231efa72725ab8997ee7949f1c1f89001ed2200fe2ee01fa7b77a74edd161\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:19Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1126 22:39:19.377351 6327 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1126 22:39:19.377426 6327 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1126 22:39:19.377452 6327 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1126 22:39:19.377489 6327 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1126 22:39:19.377486 6327 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1126 22:39:19.377555 6327 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1126 22:39:19.378049 6327 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1126 22:39:19.378571 6327 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 22:39:19.378619 6327 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1126 22:39:19.378645 6327 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 22:39:19.378654 6327 factory.go:656] Stopping watch factory\\\\nI1126 22:39:19.378679 6327 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.049351 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.069431 5008 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.088320 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.093137 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.093188 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.093200 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:20 crc 
kubenswrapper[5008]: I1126 22:39:20.093218 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.093230 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:20Z","lastTransitionTime":"2025-11-26T22:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.111793 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.131139 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.154653 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.170701 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.195352 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.195386 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.195399 5008 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.195443 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.195457 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:20Z","lastTransitionTime":"2025-11-26T22:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.299088 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.299131 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.299145 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.299161 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.299174 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:20Z","lastTransitionTime":"2025-11-26T22:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.401984 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.402025 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.402033 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.402047 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.402056 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:20Z","lastTransitionTime":"2025-11-26T22:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.504513 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.504571 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.504587 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.504609 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.504624 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:20Z","lastTransitionTime":"2025-11-26T22:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.518007 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.518048 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.518048 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:20 crc kubenswrapper[5008]: E1126 22:39:20.518132 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:20 crc kubenswrapper[5008]: E1126 22:39:20.518289 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:20 crc kubenswrapper[5008]: E1126 22:39:20.518379 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.606858 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.606927 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.606940 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.607007 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.607027 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:20Z","lastTransitionTime":"2025-11-26T22:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.709631 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.709691 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.709708 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.709732 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.709747 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:20Z","lastTransitionTime":"2025-11-26T22:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.730495 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv"] Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.731035 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.735306 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.737559 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.744338 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445
c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.758628 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc9433194eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.774247 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fe79b6-3204-4cbc-b84f-2f700281ab05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vwnrv\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.787248 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.805105 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"con
tainerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exit
Code\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.811694 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.811760 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.811782 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.811811 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.811833 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:20Z","lastTransitionTime":"2025-11-26T22:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.813664 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pnnz\" (UniqueName: \"kubernetes.io/projected/05fe79b6-3204-4cbc-b84f-2f700281ab05-kube-api-access-9pnnz\") pod \"ovnkube-control-plane-749d76644c-vwnrv\" (UID: \"05fe79b6-3204-4cbc-b84f-2f700281ab05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.813744 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/05fe79b6-3204-4cbc-b84f-2f700281ab05-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vwnrv\" (UID: \"05fe79b6-3204-4cbc-b84f-2f700281ab05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.813791 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/05fe79b6-3204-4cbc-b84f-2f700281ab05-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vwnrv\" (UID: \"05fe79b6-3204-4cbc-b84f-2f700281ab05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.813825 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/05fe79b6-3204-4cbc-b84f-2f700281ab05-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vwnrv\" (UID: \"05fe79b6-3204-4cbc-b84f-2f700281ab05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.823909 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.825822 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovnkube-controller/1.log" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.826555 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovnkube-controller/0.log" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.829178 5008 generic.go:334] "Generic (PLEG): container finished" podID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerID="c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded" exitCode=1 Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.829216 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerDied","Data":"c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded"} Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.829245 5008 scope.go:117] "RemoveContainer" containerID="b6a231efa72725ab8997ee7949f1c1f89001ed2200fe2ee01fa7b77a74edd161" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.830022 5008 
scope.go:117] "RemoveContainer" containerID="c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded" Nov 26 22:39:20 crc kubenswrapper[5008]: E1126 22:39:20.830163 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zpbmz_openshift-ovn-kubernetes(41e5d1a8-86e0-42e2-a446-8f8938091dc1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.844661 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372
b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.860231 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.870489 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.890427 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a231efa72725ab8997ee7949f1c1f89001ed2200fe2ee01fa7b77a74edd161\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6a231efa72725ab8997ee7949f1c1f89001ed2200fe2ee01fa7b77a74edd161\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:19Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1126 22:39:19.377351 6327 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1126 22:39:19.377426 6327 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1126 22:39:19.377452 6327 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1126 22:39:19.377489 6327 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1126 22:39:19.377486 6327 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1126 22:39:19.377555 6327 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1126 22:39:19.378049 6327 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1126 22:39:19.378571 6327 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 22:39:19.378619 6327 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1126 22:39:19.378645 6327 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 22:39:19.378654 6327 factory.go:656] Stopping watch factory\\\\nI1126 22:39:19.378679 6327 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.902087 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.914021 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:20 crc 
kubenswrapper[5008]: I1126 22:39:20.914071 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.914082 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.914104 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.914117 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:20Z","lastTransitionTime":"2025-11-26T22:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.914329 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pnnz\" (UniqueName: \"kubernetes.io/projected/05fe79b6-3204-4cbc-b84f-2f700281ab05-kube-api-access-9pnnz\") pod \"ovnkube-control-plane-749d76644c-vwnrv\" (UID: \"05fe79b6-3204-4cbc-b84f-2f700281ab05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.914391 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/05fe79b6-3204-4cbc-b84f-2f700281ab05-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vwnrv\" (UID: \"05fe79b6-3204-4cbc-b84f-2f700281ab05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.914426 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/05fe79b6-3204-4cbc-b84f-2f700281ab05-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vwnrv\" (UID: \"05fe79b6-3204-4cbc-b84f-2f700281ab05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.914447 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/05fe79b6-3204-4cbc-b84f-2f700281ab05-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vwnrv\" (UID: \"05fe79b6-3204-4cbc-b84f-2f700281ab05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.914599 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.914998 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/05fe79b6-3204-4cbc-b84f-2f700281ab05-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vwnrv\" (UID: \"05fe79b6-3204-4cbc-b84f-2f700281ab05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.915224 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/05fe79b6-3204-4cbc-b84f-2f700281ab05-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vwnrv\" (UID: \"05fe79b6-3204-4cbc-b84f-2f700281ab05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.922612 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/05fe79b6-3204-4cbc-b84f-2f700281ab05-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vwnrv\" (UID: \"05fe79b6-3204-4cbc-b84f-2f700281ab05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.927626 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.931316 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pnnz\" (UniqueName: \"kubernetes.io/projected/05fe79b6-3204-4cbc-b84f-2f700281ab05-kube-api-access-9pnnz\") pod \"ovnkube-control-plane-749d76644c-vwnrv\" (UID: \"05fe79b6-3204-4cbc-b84f-2f700281ab05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.938955 5008 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.950464 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.967106 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.980956 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:20 crc kubenswrapper[5008]: I1126 22:39:20.993389 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:20Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.007333 5008 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.016878 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.016935 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.016949 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.016996 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.017015 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:21Z","lastTransitionTime":"2025-11-26T22:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.019833 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.033261 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.049379 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.050577 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: W1126 22:39:21.063512 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05fe79b6_3204_4cbc_b84f_2f700281ab05.slice/crio-a564f7ab4ad703de43f21707e5836a3322c7dfe3db3e23fe3b88b7139883bb1c WatchSource:0}: Error finding container a564f7ab4ad703de43f21707e5836a3322c7dfe3db3e23fe3b88b7139883bb1c: Status 404 returned error can't find the container with id a564f7ab4ad703de43f21707e5836a3322c7dfe3db3e23fe3b88b7139883bb1c Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.069683 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafb
eeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.093558 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6a231efa72725ab8997ee7949f1c1f89001ed2200fe2ee01fa7b77a74edd161\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:19Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1126 22:39:19.377351 6327 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1126 22:39:19.377426 6327 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI1126 22:39:19.377452 6327 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1126 22:39:19.377489 6327 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1126 22:39:19.377486 6327 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1126 22:39:19.377555 6327 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1126 22:39:19.378049 6327 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1126 22:39:19.378571 6327 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 22:39:19.378619 6327 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1126 22:39:19.378645 6327 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 22:39:19.378654 6327 factory.go:656] Stopping watch factory\\\\nI1126 22:39:19.378679 6327 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"message\\\":\\\"Namespace event handler 1 for removal\\\\nI1126 22:39:20.777606 6466 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 22:39:20.777621 6466 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 22:39:20.777641 6466 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 22:39:20.777668 6466 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 22:39:20.777697 6466 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 22:39:20.777736 6466 handler.go:190] Sending *v1.Pod event 
handler 3 for removal\\\\nI1126 22:39:20.777750 6466 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1126 22:39:20.777765 6466 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 22:39:20.777776 6466 factory.go:656] Stopping watch factory\\\\nI1126 22:39:20.777807 6466 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1126 22:39:20.777820 6466 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 22:39:20.777832 6466 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1126 22:39:20.777895 6466 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 22:39:20.777941 6466 ovnkube.go:599] Stopped ovnkube\\\\nI1126 22:39:20.777995 6466 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 22:39:20.778068 6466 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\
":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4s
f7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.107422 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"moun
tPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.119539 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 
22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.119617 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.119632 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.119652 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.119666 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:21Z","lastTransitionTime":"2025-11-26T22:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.132856 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.145203 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.157824 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.170788 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.180693 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.195360 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc94331
94eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.210003 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fe79b6-3204-4cbc-b84f-2f700281ab05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vwnrv\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.222135 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.222180 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.222192 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.222211 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.222224 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:21Z","lastTransitionTime":"2025-11-26T22:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.324865 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.324897 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.324906 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.324920 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.324928 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:21Z","lastTransitionTime":"2025-11-26T22:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.428262 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.428306 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.428321 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.428340 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.428352 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:21Z","lastTransitionTime":"2025-11-26T22:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.530615 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.530661 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.530673 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.530692 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.530704 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:21Z","lastTransitionTime":"2025-11-26T22:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.633479 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.633543 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.633560 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.633585 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.633604 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:21Z","lastTransitionTime":"2025-11-26T22:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.736875 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.736950 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.737008 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.737033 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.737052 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:21Z","lastTransitionTime":"2025-11-26T22:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.836881 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovnkube-controller/1.log" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.839292 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.839370 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.839395 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.839424 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.839446 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:21Z","lastTransitionTime":"2025-11-26T22:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.844401 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" event={"ID":"05fe79b6-3204-4cbc-b84f-2f700281ab05","Type":"ContainerStarted","Data":"5e0874f3905e78658040b32877f9ac9910a494c9251878cdd8b31c83d89aa3f7"} Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.844461 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" event={"ID":"05fe79b6-3204-4cbc-b84f-2f700281ab05","Type":"ContainerStarted","Data":"94e2f28b2aa2ba5a228b7d53f35c8176619953fe656fecb08d539d6cd48d5207"} Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.844488 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" event={"ID":"05fe79b6-3204-4cbc-b84f-2f700281ab05","Type":"ContainerStarted","Data":"a564f7ab4ad703de43f21707e5836a3322c7dfe3db3e23fe3b88b7139883bb1c"} Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.861208 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-xplkg"] Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.861769 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:21 crc kubenswrapper[5008]: E1126 22:39:21.861849 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.867103 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.882215 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.914261 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6a231efa72725ab8997ee7949f1c1f89001ed2200fe2ee01fa7b77a74edd161\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:19Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1126 22:39:19.377351 6327 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1126 22:39:19.377426 6327 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI1126 22:39:19.377452 6327 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1126 22:39:19.377489 6327 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1126 22:39:19.377486 6327 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1126 22:39:19.377555 6327 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1126 22:39:19.378049 6327 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1126 22:39:19.378571 6327 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 22:39:19.378619 6327 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1126 22:39:19.378645 6327 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 22:39:19.378654 6327 factory.go:656] Stopping watch factory\\\\nI1126 22:39:19.378679 6327 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"message\\\":\\\"Namespace event handler 1 for removal\\\\nI1126 22:39:20.777606 6466 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 22:39:20.777621 6466 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 22:39:20.777641 6466 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 22:39:20.777668 6466 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 22:39:20.777697 6466 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 22:39:20.777736 6466 handler.go:190] Sending *v1.Pod event 
handler 3 for removal\\\\nI1126 22:39:20.777750 6466 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1126 22:39:20.777765 6466 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 22:39:20.777776 6466 factory.go:656] Stopping watch factory\\\\nI1126 22:39:20.777807 6466 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1126 22:39:20.777820 6466 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 22:39:20.777832 6466 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1126 22:39:20.777895 6466 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 22:39:20.777941 6466 ovnkube.go:599] Stopped ovnkube\\\\nI1126 22:39:20.777995 6466 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 22:39:20.778068 6466 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\
":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4s
f7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.924362 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs\") pod \"network-metrics-daemon-xplkg\" (UID: \"7adf9a69-5de6-4710-b394-968387df9ae6\") " pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.924652 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc88z\" (UniqueName: \"kubernetes.io/projected/7adf9a69-5de6-4710-b394-968387df9ae6-kube-api-access-tc88z\") pod \"network-metrics-daemon-xplkg\" (UID: \"7adf9a69-5de6-4710-b394-968387df9ae6\") " pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.934992 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.943363 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.943432 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.943454 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.943488 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.943511 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:21Z","lastTransitionTime":"2025-11-26T22:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.956189 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.975551 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:21 crc kubenswrapper[5008]: I1126 22:39:21.994847 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:21Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.015524 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.025199 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc88z\" (UniqueName: \"kubernetes.io/projected/7adf9a69-5de6-4710-b394-968387df9ae6-kube-api-access-tc88z\") pod \"network-metrics-daemon-xplkg\" (UID: \"7adf9a69-5de6-4710-b394-968387df9ae6\") " pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.025249 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs\") pod \"network-metrics-daemon-xplkg\" (UID: \"7adf9a69-5de6-4710-b394-968387df9ae6\") " pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.025426 5008 secret.go:188] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.025498 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs podName:7adf9a69-5de6-4710-b394-968387df9ae6 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:22.525479289 +0000 UTC m=+37.938173301 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs") pod "network-metrics-daemon-xplkg" (UID: "7adf9a69-5de6-4710-b394-968387df9ae6") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.040072 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa37232690
19bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o:/
/7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.046030 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.046083 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.046101 5008 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.046124 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.046144 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:22Z","lastTransitionTime":"2025-11-26T22:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.056411 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf
1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.056537 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc88z\" (UniqueName: \"kubernetes.io/projected/7adf9a69-5de6-4710-b394-968387df9ae6-kube-api-access-tc88z\") pod \"network-metrics-daemon-xplkg\" (UID: \"7adf9a69-5de6-4710-b394-968387df9ae6\") " pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.074822 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc9433194eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.089519 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fe79b6-3204-4cbc-b84f-2f700281ab05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94e2f28b2aa2ba5a228b7d53f35c8176619953fe656fecb08d539d6cd48d5207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e0874f3905e78658040b32877f9ac9910a494c9251878cdd8b31c83d89aa3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:20Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vwnrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.108433 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\"
:\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.133391 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plug
in\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.148986 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.149037 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.149054 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.149075 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.149089 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:22Z","lastTransitionTime":"2025-11-26T22:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.157067 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.178649 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.196195 5008 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc9433194eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.212904 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fe79b6-3204-4cbc-b84f-2f700281ab05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94e2f28b2aa2ba5a228b7d53f35c8176619953fe656fecb08d539d6cd48d5207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3
d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e0874f3905e78658040b32877f9ac9910a494c9251878cdd8b31c83d89aa3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:20Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vwnrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.227143 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.227371 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:39:38.227333967 +0000 UTC m=+53.640027989 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.227517 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.227703 5008 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.227826 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:38.227803072 +0000 UTC m=+53.640497114 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.231545 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.245668 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb7
2bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.251527 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.251632 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.251650 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.251671 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 
22:39:22.251686 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:22Z","lastTransitionTime":"2025-11-26T22:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.266008 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.284845 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.303117 5008 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.315406 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.328103 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.328189 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.328230 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.328274 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.328308 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.328316 5008 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.328347 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.328360 5008 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.328374 5008 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.328320 5008 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.328433 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:38.328413261 +0000 UTC m=+53.741107273 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.328455 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:38.328446532 +0000 UTC m=+53.741140544 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.328469 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:38.328462202 +0000 UTC m=+53.741156224 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.341389 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6a231efa72725ab8997ee7949f1c1f89001ed2200fe2ee01fa7b77a74edd161\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:19Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1126 22:39:19.377351 6327 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1126 22:39:19.377426 6327 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI1126 22:39:19.377452 6327 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1126 22:39:19.377489 6327 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1126 22:39:19.377486 6327 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1126 22:39:19.377555 6327 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1126 22:39:19.378049 6327 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1126 22:39:19.378571 6327 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 22:39:19.378619 6327 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1126 22:39:19.378645 6327 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 22:39:19.378654 6327 factory.go:656] Stopping watch factory\\\\nI1126 22:39:19.378679 6327 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"message\\\":\\\"Namespace event handler 1 for removal\\\\nI1126 22:39:20.777606 6466 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 22:39:20.777621 6466 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 22:39:20.777641 6466 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 22:39:20.777668 6466 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 22:39:20.777697 6466 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 22:39:20.777736 6466 handler.go:190] Sending *v1.Pod event 
handler 3 for removal\\\\nI1126 22:39:20.777750 6466 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1126 22:39:20.777765 6466 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 22:39:20.777776 6466 factory.go:656] Stopping watch factory\\\\nI1126 22:39:20.777807 6466 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1126 22:39:20.777820 6466 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 22:39:20.777832 6466 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1126 22:39:20.777895 6466 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 22:39:20.777941 6466 ovnkube.go:599] Stopped ovnkube\\\\nI1126 22:39:20.777995 6466 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 22:39:20.778068 6466 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\
":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4s
f7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.355852 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 
26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.355907 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.355916 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.355929 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.355940 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:22Z","lastTransitionTime":"2025-11-26T22:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.357805 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.368591 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xplkg" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7adf9a69-5de6-4710-b394-968387df9ae6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xplkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc 
kubenswrapper[5008]: I1126 22:39:22.378824 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.394485 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.405854 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.420655 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.449525 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.458258 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.458320 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.458332 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.458348 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.458358 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:22Z","lastTransitionTime":"2025-11-26T22:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.460645 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.518086 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.518181 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.518363 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.518420 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.518622 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.518803 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.530192 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs\") pod \"network-metrics-daemon-xplkg\" (UID: \"7adf9a69-5de6-4710-b394-968387df9ae6\") " pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.530293 5008 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.530579 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs podName:7adf9a69-5de6-4710-b394-968387df9ae6 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:23.530385561 +0000 UTC m=+38.943079593 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs") pod "network-metrics-daemon-xplkg" (UID: "7adf9a69-5de6-4710-b394-968387df9ae6") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.560905 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.560937 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.560948 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.560983 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.560995 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:22Z","lastTransitionTime":"2025-11-26T22:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.664496 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.664567 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.664619 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.664654 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.664678 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:22Z","lastTransitionTime":"2025-11-26T22:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.907083 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.912139 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.912200 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.912218 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.912244 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.912261 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:22Z","lastTransitionTime":"2025-11-26T22:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
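The patch failure above pins down a second fault: the kubelet cannot reach the node.network-node-identity.openshift.io webhook on 127.0.0.1:9743 because its serving certificate expired on 2025-08-24, while the node clock reads 2025-11-26. A minimal sketch of that comparison, using the two timestamps verbatim from the log (GNU `date` assumed); on a live node one might instead inspect the certificate directly with `openssl s_client -connect 127.0.0.1:9743 </dev/null | openssl x509 -noout -dates`:

```shell
# Timestamps taken verbatim from the x509 error above.
now="2025-11-26T22:39:22Z"        # node's current time
not_after="2025-08-24T17:21:41Z"  # webhook certificate notAfter

# GNU date parses ISO 8601 with -d; compare both as epoch seconds.
if [ "$(date -ud "$now" +%s)" -gt "$(date -ud "$not_after" +%s)" ]; then
  echo "certificate expired"      # matches the log's "certificate has expired"
else
  echo "certificate still valid"
fi
```

Because the node status patch is rejected by this expired webhook, the retries that follow fail identically until the certificate is rotated.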
Has your network provider started?"} Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.964234 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.970402 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.970469 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.970489 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.970518 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:22 crc kubenswrapper[5008]: I1126 22:39:22.970537 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:22Z","lastTransitionTime":"2025-11-26T22:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:22 crc kubenswrapper[5008]: E1126 22:39:22.997061 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:22Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.001738 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.001785 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.001795 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.001810 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.001821 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:23Z","lastTransitionTime":"2025-11-26T22:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:23 crc kubenswrapper[5008]: E1126 22:39:23.016284 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:23Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:23 crc kubenswrapper[5008]: E1126 22:39:23.016553 5008 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.018869 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.018922 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.018939 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.019021 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.019045 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:23Z","lastTransitionTime":"2025-11-26T22:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.119321 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.121254 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.121313 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.121332 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.121359 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.121378 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:23Z","lastTransitionTime":"2025-11-26T22:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.145669 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:23Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.166322 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:23Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.186419 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7
f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:23Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.205174 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:23Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:23 crc kubenswrapper[5008]: 
I1126 22:39:23.224021 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.224064 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.224075 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.224093 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.224109 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:23Z","lastTransitionTime":"2025-11-26T22:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.238123 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6a231efa72725ab8997ee7949f1c1f89001ed2200fe2ee01fa7b77a74edd161\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:19Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1126 22:39:19.377351 6327 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1126 22:39:19.377426 6327 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI1126 22:39:19.377452 6327 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1126 22:39:19.377489 6327 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1126 22:39:19.377486 6327 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1126 22:39:19.377555 6327 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1126 22:39:19.378049 6327 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1126 22:39:19.378571 6327 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 22:39:19.378619 6327 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1126 22:39:19.378645 6327 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 22:39:19.378654 6327 factory.go:656] Stopping watch factory\\\\nI1126 22:39:19.378679 6327 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"message\\\":\\\"Namespace event handler 1 for removal\\\\nI1126 22:39:20.777606 6466 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 22:39:20.777621 6466 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 22:39:20.777641 6466 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 22:39:20.777668 6466 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 22:39:20.777697 6466 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 22:39:20.777736 6466 handler.go:190] Sending *v1.Pod event 
handler 3 for removal\\\\nI1126 22:39:20.777750 6466 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1126 22:39:20.777765 6466 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 22:39:20.777776 6466 factory.go:656] Stopping watch factory\\\\nI1126 22:39:20.777807 6466 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1126 22:39:20.777820 6466 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 22:39:20.777832 6466 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1126 22:39:20.777895 6466 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 22:39:20.777941 6466 ovnkube.go:599] Stopped ovnkube\\\\nI1126 22:39:20.777995 6466 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 22:39:20.778068 6466 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\
":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4s
f7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:23Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.260091 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"moun
tPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:23Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.274027 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xplkg" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7adf9a69-5de6-4710-b394-968387df9ae6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xplkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:23Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:23 crc 
kubenswrapper[5008]: I1126 22:39:23.291101 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:23Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.307989 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:23Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.326512 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.326549 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.326565 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:23 crc 
kubenswrapper[5008]: I1126 22:39:23.326586 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.326601 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:23Z","lastTransitionTime":"2025-11-26T22:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.326674 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:23Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.344871 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:23Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.378103 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:23Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.399621 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:23Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.415803 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc9433194eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:23Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.469030 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fe79b6-3204-4cbc-b84f-2f700281ab05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94e2f28b2aa2ba5a228b7d53f35c8176619953fe656fecb08d539d6cd48d5207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e0874f3905e78658040b32877f9ac9910a494c9251878cdd8b31c83d89aa3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vwnrv\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:23Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.471183 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.471448 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.471644 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.471872 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.472137 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:23Z","lastTransitionTime":"2025-11-26T22:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.489241 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:23Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.505918 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:23Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.518553 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:23 crc kubenswrapper[5008]: E1126 22:39:23.518718 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.571074 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs\") pod \"network-metrics-daemon-xplkg\" (UID: \"7adf9a69-5de6-4710-b394-968387df9ae6\") " pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:23 crc kubenswrapper[5008]: E1126 22:39:23.571317 5008 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 22:39:23 crc kubenswrapper[5008]: E1126 22:39:23.571480 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs podName:7adf9a69-5de6-4710-b394-968387df9ae6 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:25.571447613 +0000 UTC m=+40.984141705 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs") pod "network-metrics-daemon-xplkg" (UID: "7adf9a69-5de6-4710-b394-968387df9ae6") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.575193 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.575437 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.575598 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.575814 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.576085 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:23Z","lastTransitionTime":"2025-11-26T22:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.679479 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.679529 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.679546 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.679568 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.679584 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:23Z","lastTransitionTime":"2025-11-26T22:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.783124 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.783197 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.783219 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.783246 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.783265 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:23Z","lastTransitionTime":"2025-11-26T22:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.886465 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.886532 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.886557 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.886586 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.886613 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:23Z","lastTransitionTime":"2025-11-26T22:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.989404 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.989639 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.989656 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.989678 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:23 crc kubenswrapper[5008]: I1126 22:39:23.989696 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:23Z","lastTransitionTime":"2025-11-26T22:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.093308 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.093380 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.093397 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.093426 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.093449 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:24Z","lastTransitionTime":"2025-11-26T22:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.196036 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.196073 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.196085 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.196100 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.196111 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:24Z","lastTransitionTime":"2025-11-26T22:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.299461 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.299525 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.299548 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.299595 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.299621 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:24Z","lastTransitionTime":"2025-11-26T22:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.402580 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.402631 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.402648 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.402671 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.402688 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:24Z","lastTransitionTime":"2025-11-26T22:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.505683 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.505769 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.505799 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.505825 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.505842 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:24Z","lastTransitionTime":"2025-11-26T22:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.518224 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.518295 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.518341 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:24 crc kubenswrapper[5008]: E1126 22:39:24.518592 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:24 crc kubenswrapper[5008]: E1126 22:39:24.519102 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:24 crc kubenswrapper[5008]: E1126 22:39:24.519321 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.608918 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.609017 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.609037 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.609063 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.609080 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:24Z","lastTransitionTime":"2025-11-26T22:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.712468 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.712539 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.712565 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.712594 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.712616 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:24Z","lastTransitionTime":"2025-11-26T22:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.815404 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.815475 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.815498 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.815527 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.815549 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:24Z","lastTransitionTime":"2025-11-26T22:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.918241 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.918296 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.918312 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.918331 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:24 crc kubenswrapper[5008]: I1126 22:39:24.918347 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:24Z","lastTransitionTime":"2025-11-26T22:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.020985 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.021041 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.021057 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.021084 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.021099 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:25Z","lastTransitionTime":"2025-11-26T22:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.123933 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.124051 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.124070 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.124093 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.124113 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:25Z","lastTransitionTime":"2025-11-26T22:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.226646 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.226689 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.226705 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.226727 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.226742 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:25Z","lastTransitionTime":"2025-11-26T22:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.329107 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.329138 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.329146 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.329158 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.329167 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:25Z","lastTransitionTime":"2025-11-26T22:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.431731 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.431794 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.431817 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.431841 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.431857 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:25Z","lastTransitionTime":"2025-11-26T22:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.518147 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:25 crc kubenswrapper[5008]: E1126 22:39:25.518400 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.534170 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.534205 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.534216 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.534232 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.534243 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:25Z","lastTransitionTime":"2025-11-26T22:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.539182 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:25Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.554025 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:25Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.567159 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:25Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.580487 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:25Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.593220 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs\") pod \"network-metrics-daemon-xplkg\" (UID: \"7adf9a69-5de6-4710-b394-968387df9ae6\") " pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:25 crc kubenswrapper[5008]: E1126 22:39:25.593446 5008 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 22:39:25 crc kubenswrapper[5008]: E1126 22:39:25.593563 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs podName:7adf9a69-5de6-4710-b394-968387df9ae6 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:29.593530583 +0000 UTC m=+45.006224665 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs") pod "network-metrics-daemon-xplkg" (UID: "7adf9a69-5de6-4710-b394-968387df9ae6") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.604452 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6a231efa72725ab8997ee7949f1c1f89001ed2200fe2ee01fa7b77a74edd161\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:19Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1126 22:39:19.377351 6327 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1126 22:39:19.377426 6327 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI1126 22:39:19.377452 6327 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1126 22:39:19.377489 6327 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1126 22:39:19.377486 6327 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1126 22:39:19.377555 6327 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1126 22:39:19.378049 6327 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1126 22:39:19.378571 6327 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 22:39:19.378619 6327 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1126 22:39:19.378645 6327 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 22:39:19.378654 6327 factory.go:656] Stopping watch factory\\\\nI1126 22:39:19.378679 6327 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"message\\\":\\\"Namespace event handler 1 for removal\\\\nI1126 22:39:20.777606 6466 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 22:39:20.777621 6466 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 22:39:20.777641 6466 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 22:39:20.777668 6466 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 22:39:20.777697 6466 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 22:39:20.777736 6466 handler.go:190] Sending *v1.Pod event 
handler 3 for removal\\\\nI1126 22:39:20.777750 6466 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1126 22:39:20.777765 6466 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 22:39:20.777776 6466 factory.go:656] Stopping watch factory\\\\nI1126 22:39:20.777807 6466 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1126 22:39:20.777820 6466 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 22:39:20.777832 6466 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1126 22:39:20.777895 6466 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 22:39:20.777941 6466 ovnkube.go:599] Stopped ovnkube\\\\nI1126 22:39:20.777995 6466 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 22:39:20.778068 6466 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\
":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4s
f7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:25Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.622592 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"moun
tPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:25Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.633481 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xplkg" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7adf9a69-5de6-4710-b394-968387df9ae6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xplkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:25Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:25 crc 
kubenswrapper[5008]: I1126 22:39:25.635829 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.635869 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.635885 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.635907 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.635925 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:25Z","lastTransitionTime":"2025-11-26T22:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.656923 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:25Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.673383 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:25Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.684082 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:25Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.696844 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T22:39:25Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.704816 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:25Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.713833 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc94331
94eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:25Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.725592 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fe79b6-3204-4cbc-b84f-2f700281ab05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94e2f28b2aa2ba5a228b7d53f35c8176619953fe656fecb08d539d6cd48d5207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e0874f3905e78658040b32877f9ac9910a49
4c9251878cdd8b31c83d89aa3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vwnrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:25Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.737642 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\"
,\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1
e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:25Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.738316 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.738366 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.738378 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.738394 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.738405 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:25Z","lastTransitionTime":"2025-11-26T22:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.748463 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e23
5986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:25Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.761334 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7
f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:25Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.840273 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.840319 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.840330 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.840347 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.840358 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:25Z","lastTransitionTime":"2025-11-26T22:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.942562 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.942613 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.942631 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.942652 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:25 crc kubenswrapper[5008]: I1126 22:39:25.942668 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:25Z","lastTransitionTime":"2025-11-26T22:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.046037 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.046110 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.046135 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.046166 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.046189 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:26Z","lastTransitionTime":"2025-11-26T22:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.149827 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.149899 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.149917 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.149944 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.150022 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:26Z","lastTransitionTime":"2025-11-26T22:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.252873 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.252916 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.252926 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.252940 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.252949 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:26Z","lastTransitionTime":"2025-11-26T22:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.356052 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.356111 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.356128 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.356150 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.356166 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:26Z","lastTransitionTime":"2025-11-26T22:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.459385 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.459457 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.459492 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.459522 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.459543 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:26Z","lastTransitionTime":"2025-11-26T22:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.517647 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.517763 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:26 crc kubenswrapper[5008]: E1126 22:39:26.517835 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.517647 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:26 crc kubenswrapper[5008]: E1126 22:39:26.518042 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:26 crc kubenswrapper[5008]: E1126 22:39:26.518096 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.563535 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.563596 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.563613 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.563635 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.563657 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:26Z","lastTransitionTime":"2025-11-26T22:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.666678 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.666733 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.666750 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.666775 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.666792 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:26Z","lastTransitionTime":"2025-11-26T22:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.770218 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.770288 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.770306 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.770328 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.770344 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:26Z","lastTransitionTime":"2025-11-26T22:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.873570 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.873695 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.873714 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.873739 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.873757 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:26Z","lastTransitionTime":"2025-11-26T22:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.977333 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.977397 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.977411 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.977436 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:26 crc kubenswrapper[5008]: I1126 22:39:26.977520 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:26Z","lastTransitionTime":"2025-11-26T22:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.081227 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.081311 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.081340 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.081370 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.081390 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:27Z","lastTransitionTime":"2025-11-26T22:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.184071 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.184134 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.184152 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.184177 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.184194 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:27Z","lastTransitionTime":"2025-11-26T22:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.286878 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.286946 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.287005 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.287038 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.287068 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:27Z","lastTransitionTime":"2025-11-26T22:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.389543 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.389575 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.389586 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.389604 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.389615 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:27Z","lastTransitionTime":"2025-11-26T22:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.492535 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.492593 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.492611 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.492633 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.492650 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:27Z","lastTransitionTime":"2025-11-26T22:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.518261 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:27 crc kubenswrapper[5008]: E1126 22:39:27.518431 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.596247 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.596316 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.596338 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.596366 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.596384 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:27Z","lastTransitionTime":"2025-11-26T22:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.699062 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.699127 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.699145 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.699171 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.699188 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:27Z","lastTransitionTime":"2025-11-26T22:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.802479 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.802536 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.802554 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.802580 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.802601 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:27Z","lastTransitionTime":"2025-11-26T22:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.905817 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.905881 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.905940 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.906006 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:27 crc kubenswrapper[5008]: I1126 22:39:27.906026 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:27Z","lastTransitionTime":"2025-11-26T22:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.008947 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.009016 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.009028 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.009052 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.009066 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:28Z","lastTransitionTime":"2025-11-26T22:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.112167 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.112248 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.112273 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.112299 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.112316 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:28Z","lastTransitionTime":"2025-11-26T22:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.153635 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.154837 5008 scope.go:117] "RemoveContainer" containerID="c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded" Nov 26 22:39:28 crc kubenswrapper[5008]: E1126 22:39:28.155111 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zpbmz_openshift-ovn-kubernetes(41e5d1a8-86e0-42e2-a446-8f8938091dc1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.175155 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafb
eeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:28Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.209242 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"message\\\":\\\"Namespace event handler 1 for removal\\\\nI1126 22:39:20.777606 6466 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 22:39:20.777621 6466 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 22:39:20.777641 6466 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 22:39:20.777668 6466 handler.go:190] 
Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 22:39:20.777697 6466 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 22:39:20.777736 6466 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1126 22:39:20.777750 6466 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1126 22:39:20.777765 6466 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 22:39:20.777776 6466 factory.go:656] Stopping watch factory\\\\nI1126 22:39:20.777807 6466 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1126 22:39:20.777820 6466 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 22:39:20.777832 6466 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1126 22:39:20.777895 6466 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 22:39:20.777941 6466 ovnkube.go:599] Stopped ovnkube\\\\nI1126 22:39:20.777995 6466 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 22:39:20.778068 6466 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zpbmz_openshift-ovn-kubernetes(41e5d1a8-86e0-42e2-a446-8f8938091dc1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63
968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:28Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.214915 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.215013 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.215039 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.215068 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.215093 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:28Z","lastTransitionTime":"2025-11-26T22:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.233144 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:
39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:28Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.251767 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xplkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7adf9a69-5de6-4710-b394-968387df9ae6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xplkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:28Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:28 crc 
kubenswrapper[5008]: I1126 22:39:28.273815 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:28Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.294508 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:28Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.318243 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.318297 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.318314 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:28 crc 
kubenswrapper[5008]: I1126 22:39:28.318338 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.318358 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:28Z","lastTransitionTime":"2025-11-26T22:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.319010 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:28Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.339493 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:28Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.371609 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:28Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.391301 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:28Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.406431 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc9433194eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:28Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.421718 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.421781 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.421801 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.421826 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.421846 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:28Z","lastTransitionTime":"2025-11-26T22:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.429703 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fe79b6-3204-4cbc-b84f-2f700281ab05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94e2f28b2aa2ba5a228b7d53f35c8176619953fe656fecb08d539d6cd48d5207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e0874f3905e78658040b32877f9ac9910a494c9251878cdd8b31c83d89aa3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vwnrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:28Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.449065 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-26T22:39:28Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.466828 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:28Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.484021 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 
22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:28Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.500716 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:28Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.517753 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.517850 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.517903 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:28 crc kubenswrapper[5008]: E1126 22:39:28.518140 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:28 crc kubenswrapper[5008]: E1126 22:39:28.518255 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:28 crc kubenswrapper[5008]: E1126 22:39:28.518364 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.522134 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-
release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:28Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.524865 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.524944 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.525001 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.525039 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.525061 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:28Z","lastTransitionTime":"2025-11-26T22:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.628653 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.628726 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.628743 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.628767 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.628783 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:28Z","lastTransitionTime":"2025-11-26T22:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.731694 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.731739 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.731747 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.731761 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.731771 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:28Z","lastTransitionTime":"2025-11-26T22:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.835578 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.835642 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.835665 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.835690 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.835705 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:28Z","lastTransitionTime":"2025-11-26T22:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.939671 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.939734 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.939755 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.939781 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:28 crc kubenswrapper[5008]: I1126 22:39:28.939799 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:28Z","lastTransitionTime":"2025-11-26T22:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.043089 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.043162 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.043184 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.043212 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.043235 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:29Z","lastTransitionTime":"2025-11-26T22:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.146483 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.146902 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.147107 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.147309 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.147486 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:29Z","lastTransitionTime":"2025-11-26T22:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.252398 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.252799 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.253013 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.253192 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.253394 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:29Z","lastTransitionTime":"2025-11-26T22:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.357220 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.357281 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.357300 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.357325 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.357345 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:29Z","lastTransitionTime":"2025-11-26T22:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.462156 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.462230 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.462248 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.462273 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.462291 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:29Z","lastTransitionTime":"2025-11-26T22:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.518412 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:29 crc kubenswrapper[5008]: E1126 22:39:29.518700 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.565815 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.565868 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.565885 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.565906 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.565921 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:29Z","lastTransitionTime":"2025-11-26T22:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.668284 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs\") pod \"network-metrics-daemon-xplkg\" (UID: \"7adf9a69-5de6-4710-b394-968387df9ae6\") " pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:29 crc kubenswrapper[5008]: E1126 22:39:29.668471 5008 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 22:39:29 crc kubenswrapper[5008]: E1126 22:39:29.668845 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs podName:7adf9a69-5de6-4710-b394-968387df9ae6 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:37.668823274 +0000 UTC m=+53.081517286 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs") pod "network-metrics-daemon-xplkg" (UID: "7adf9a69-5de6-4710-b394-968387df9ae6") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.668567 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.668921 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.668944 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.669002 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.669030 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:29Z","lastTransitionTime":"2025-11-26T22:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.771678 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.771746 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.771768 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.771795 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.771816 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:29Z","lastTransitionTime":"2025-11-26T22:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.874651 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.874720 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.874737 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.874758 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.874774 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:29Z","lastTransitionTime":"2025-11-26T22:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.977698 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.977740 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.977751 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.977769 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:29 crc kubenswrapper[5008]: I1126 22:39:29.977784 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:29Z","lastTransitionTime":"2025-11-26T22:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.080447 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.080500 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.080526 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.080555 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.080578 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:30Z","lastTransitionTime":"2025-11-26T22:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.183260 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.183317 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.183339 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.183370 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.183390 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:30Z","lastTransitionTime":"2025-11-26T22:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.285868 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.285919 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.285937 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.285961 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.286017 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:30Z","lastTransitionTime":"2025-11-26T22:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.389115 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.389176 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.389193 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.389218 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.389235 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:30Z","lastTransitionTime":"2025-11-26T22:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.491135 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.491207 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.491224 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.491249 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.491298 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:30Z","lastTransitionTime":"2025-11-26T22:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.517415 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:30 crc kubenswrapper[5008]: E1126 22:39:30.517590 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.517794 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:30 crc kubenswrapper[5008]: E1126 22:39:30.518051 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.518065 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:30 crc kubenswrapper[5008]: E1126 22:39:30.518206 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.594327 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.594372 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.594383 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.594401 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.594412 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:30Z","lastTransitionTime":"2025-11-26T22:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.697599 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.698068 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.698277 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.698445 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.698599 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:30Z","lastTransitionTime":"2025-11-26T22:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.801571 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.801658 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.801678 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.801706 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.801727 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:30Z","lastTransitionTime":"2025-11-26T22:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.904595 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.904651 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.904677 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.904704 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:30 crc kubenswrapper[5008]: I1126 22:39:30.904726 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:30Z","lastTransitionTime":"2025-11-26T22:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.007387 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.007451 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.007471 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.007495 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.007511 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:31Z","lastTransitionTime":"2025-11-26T22:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.110314 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.110375 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.110393 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.110417 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.110435 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:31Z","lastTransitionTime":"2025-11-26T22:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.214565 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.214639 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.214660 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.214686 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.214708 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:31Z","lastTransitionTime":"2025-11-26T22:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.317873 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.317955 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.318021 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.318053 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.318074 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:31Z","lastTransitionTime":"2025-11-26T22:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.421122 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.421181 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.421201 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.421226 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.421243 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:31Z","lastTransitionTime":"2025-11-26T22:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.517538 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:31 crc kubenswrapper[5008]: E1126 22:39:31.517740 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.524789 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.524937 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.525032 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.525142 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.525169 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:31Z","lastTransitionTime":"2025-11-26T22:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.628865 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.629021 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.629043 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.629070 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.629088 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:31Z","lastTransitionTime":"2025-11-26T22:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.733480 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.733565 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.733588 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.733621 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.733653 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:31Z","lastTransitionTime":"2025-11-26T22:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.837498 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.837546 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.837565 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.837590 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.837608 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:31Z","lastTransitionTime":"2025-11-26T22:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.941447 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.942134 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.942357 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.942578 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:31 crc kubenswrapper[5008]: I1126 22:39:31.942786 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:31Z","lastTransitionTime":"2025-11-26T22:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.046771 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.046838 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.046854 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.046883 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.046900 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:32Z","lastTransitionTime":"2025-11-26T22:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.150122 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.150201 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.150215 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.150245 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.150274 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:32Z","lastTransitionTime":"2025-11-26T22:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.253612 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.253696 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.253719 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.253751 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.253774 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:32Z","lastTransitionTime":"2025-11-26T22:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.357920 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.358032 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.358052 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.358076 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.358093 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:32Z","lastTransitionTime":"2025-11-26T22:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.461518 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.461587 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.461605 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.461629 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.461651 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:32Z","lastTransitionTime":"2025-11-26T22:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.518184 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.518214 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.518336 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:32 crc kubenswrapper[5008]: E1126 22:39:32.518631 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:32 crc kubenswrapper[5008]: E1126 22:39:32.518759 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:32 crc kubenswrapper[5008]: E1126 22:39:32.518944 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.564772 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.564842 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.564854 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.564876 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.564889 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:32Z","lastTransitionTime":"2025-11-26T22:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.668076 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.668146 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.668170 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.668197 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.668215 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:32Z","lastTransitionTime":"2025-11-26T22:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.771444 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.771499 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.771520 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.771545 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.771563 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:32Z","lastTransitionTime":"2025-11-26T22:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.875453 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.875541 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.875604 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.875635 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.876264 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:32Z","lastTransitionTime":"2025-11-26T22:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.980373 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.980471 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.980494 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.980534 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:32 crc kubenswrapper[5008]: I1126 22:39:32.980563 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:32Z","lastTransitionTime":"2025-11-26T22:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.084079 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.084161 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.084175 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.084202 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.084219 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:33Z","lastTransitionTime":"2025-11-26T22:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.187546 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.187629 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.187656 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.187683 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.187700 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:33Z","lastTransitionTime":"2025-11-26T22:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.291564 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.291638 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.291656 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.291680 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.291698 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:33Z","lastTransitionTime":"2025-11-26T22:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.316383 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.316453 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.316476 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.316510 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.316532 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:33Z","lastTransitionTime":"2025-11-26T22:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:33 crc kubenswrapper[5008]: E1126 22:39:33.337104 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:33Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.342670 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.342784 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.342811 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.342845 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.342868 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:33Z","lastTransitionTime":"2025-11-26T22:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:33 crc kubenswrapper[5008]: E1126 22:39:33.363026 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:33Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.368028 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.368077 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.368094 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.368120 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.368140 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:33Z","lastTransitionTime":"2025-11-26T22:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:33 crc kubenswrapper[5008]: E1126 22:39:33.390131 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:33Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.394573 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.394642 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.394666 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.394721 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.394740 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:33Z","lastTransitionTime":"2025-11-26T22:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:33 crc kubenswrapper[5008]: E1126 22:39:33.416505 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:33Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.421433 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.421489 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.421508 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.421536 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.421554 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:33Z","lastTransitionTime":"2025-11-26T22:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:33 crc kubenswrapper[5008]: E1126 22:39:33.441136 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:33Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:33 crc kubenswrapper[5008]: E1126 22:39:33.441423 5008 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.443996 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.444033 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.444053 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.444072 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.444085 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:33Z","lastTransitionTime":"2025-11-26T22:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.518037 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:33 crc kubenswrapper[5008]: E1126 22:39:33.518250 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.547186 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.547262 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.547286 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.547315 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.547335 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:33Z","lastTransitionTime":"2025-11-26T22:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.650839 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.650935 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.650960 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.651025 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.651053 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:33Z","lastTransitionTime":"2025-11-26T22:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.753712 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.753774 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.753798 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.753826 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.753850 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:33Z","lastTransitionTime":"2025-11-26T22:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.857493 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.857646 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.857673 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.857702 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.857723 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:33Z","lastTransitionTime":"2025-11-26T22:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.961127 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.961192 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.961230 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.961260 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:33 crc kubenswrapper[5008]: I1126 22:39:33.961280 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:33Z","lastTransitionTime":"2025-11-26T22:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.064552 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.064658 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.064678 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.064711 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.064735 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:34Z","lastTransitionTime":"2025-11-26T22:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.167824 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.167954 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.168003 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.168028 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.168045 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:34Z","lastTransitionTime":"2025-11-26T22:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.271913 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.272007 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.272026 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.272051 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.272068 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:34Z","lastTransitionTime":"2025-11-26T22:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.376067 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.376125 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.376141 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.376165 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.376183 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:34Z","lastTransitionTime":"2025-11-26T22:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.478882 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.479011 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.479029 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.479053 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.479070 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:34Z","lastTransitionTime":"2025-11-26T22:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.517880 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.517887 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:34 crc kubenswrapper[5008]: E1126 22:39:34.518160 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:34 crc kubenswrapper[5008]: E1126 22:39:34.518283 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.517910 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:34 crc kubenswrapper[5008]: E1126 22:39:34.518760 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.582142 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.582210 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.582229 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.582255 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.582274 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:34Z","lastTransitionTime":"2025-11-26T22:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.685015 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.685083 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.685099 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.685124 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.685144 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:34Z","lastTransitionTime":"2025-11-26T22:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.789154 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.789212 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.789230 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.789253 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.789271 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:34Z","lastTransitionTime":"2025-11-26T22:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.892831 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.893034 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.893059 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.893100 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.893121 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:34Z","lastTransitionTime":"2025-11-26T22:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.995918 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.996095 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.996126 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.996154 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:34 crc kubenswrapper[5008]: I1126 22:39:34.996173 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:34Z","lastTransitionTime":"2025-11-26T22:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.099682 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.099759 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.099784 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.099857 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.099895 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:35Z","lastTransitionTime":"2025-11-26T22:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.203384 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.203441 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.203459 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.203482 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.203501 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:35Z","lastTransitionTime":"2025-11-26T22:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.306358 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.306425 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.306465 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.306496 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.306515 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:35Z","lastTransitionTime":"2025-11-26T22:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.409484 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.409595 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.409613 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.409643 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.409662 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:35Z","lastTransitionTime":"2025-11-26T22:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.513361 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.513421 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.513438 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.513459 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.513474 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:35Z","lastTransitionTime":"2025-11-26T22:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.517831 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:35 crc kubenswrapper[5008]: E1126 22:39:35.518085 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.542476 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:35Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.562338 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:35Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.595815 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:35Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.614863 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:35Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.617344 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.617405 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.617424 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.617449 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:35 crc 
kubenswrapper[5008]: I1126 22:39:35.617469 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:35Z","lastTransitionTime":"2025-11-26T22:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.634241 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc9433194eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:35Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.655292 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fe79b6-3204-4cbc-b84f-2f700281ab05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94e2f28b2aa2ba5a228b7d53f35c8176619953fe656fecb08d539d6cd48d5207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e0874f3905e78658040b32877f9ac9910a49
4c9251878cdd8b31c83d89aa3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vwnrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:35Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.674411 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T22:39:35Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.700263 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:35Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.720604 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.720666 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.720689 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.720715 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.720776 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:35Z","lastTransitionTime":"2025-11-26T22:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.724499 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:35Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.749904 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:35Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.773249 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:35Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.796788 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:35Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.822241 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"message\\\":\\\"Namespace event handler 1 for removal\\\\nI1126 22:39:20.777606 6466 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 22:39:20.777621 6466 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 22:39:20.777641 6466 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 22:39:20.777668 6466 handler.go:190] 
Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 22:39:20.777697 6466 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 22:39:20.777736 6466 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1126 22:39:20.777750 6466 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1126 22:39:20.777765 6466 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 22:39:20.777776 6466 factory.go:656] Stopping watch factory\\\\nI1126 22:39:20.777807 6466 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1126 22:39:20.777820 6466 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 22:39:20.777832 6466 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1126 22:39:20.777895 6466 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 22:39:20.777941 6466 ovnkube.go:599] Stopped ovnkube\\\\nI1126 22:39:20.777995 6466 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 22:39:20.778068 6466 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zpbmz_openshift-ovn-kubernetes(41e5d1a8-86e0-42e2-a446-8f8938091dc1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63
968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:35Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.824205 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.824262 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.824282 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.824305 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.824322 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:35Z","lastTransitionTime":"2025-11-26T22:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.845707 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:
39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:35Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.859334 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xplkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7adf9a69-5de6-4710-b394-968387df9ae6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xplkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:35Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:35 crc 
kubenswrapper[5008]: I1126 22:39:35.880685 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:35Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.900668 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:35Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.926836 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.926878 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.926890 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:35 crc 
kubenswrapper[5008]: I1126 22:39:35.926907 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:35 crc kubenswrapper[5008]: I1126 22:39:35.926924 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:35Z","lastTransitionTime":"2025-11-26T22:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.029570 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.029611 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.029622 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.029635 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.029643 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:36Z","lastTransitionTime":"2025-11-26T22:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.132261 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.132307 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.132323 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.132349 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.132366 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:36Z","lastTransitionTime":"2025-11-26T22:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.235657 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.235704 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.235721 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.235747 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.235771 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:36Z","lastTransitionTime":"2025-11-26T22:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.339214 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.339338 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.339360 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.339388 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.339409 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:36Z","lastTransitionTime":"2025-11-26T22:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.442253 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.442321 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.442337 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.442361 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.442378 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:36Z","lastTransitionTime":"2025-11-26T22:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.517641 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.517742 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:36 crc kubenswrapper[5008]: E1126 22:39:36.517794 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.517850 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:36 crc kubenswrapper[5008]: E1126 22:39:36.517951 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:36 crc kubenswrapper[5008]: E1126 22:39:36.518086 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.545926 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.546030 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.546047 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.546071 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.546088 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:36Z","lastTransitionTime":"2025-11-26T22:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.649082 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.649147 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.649165 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.649189 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.649206 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:36Z","lastTransitionTime":"2025-11-26T22:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.751627 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.751786 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.751807 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.751833 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.751851 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:36Z","lastTransitionTime":"2025-11-26T22:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.855226 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.855291 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.855312 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.855343 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.855365 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:36Z","lastTransitionTime":"2025-11-26T22:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.958303 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.958414 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.958478 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.958512 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:36 crc kubenswrapper[5008]: I1126 22:39:36.958529 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:36Z","lastTransitionTime":"2025-11-26T22:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.061936 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.062023 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.062043 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.062066 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.062084 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:37Z","lastTransitionTime":"2025-11-26T22:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.165138 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.165272 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.165294 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.165319 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.165337 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:37Z","lastTransitionTime":"2025-11-26T22:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.267758 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.267858 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.267876 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.267939 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.267999 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:37Z","lastTransitionTime":"2025-11-26T22:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.371142 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.371203 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.371219 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.371243 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.371264 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:37Z","lastTransitionTime":"2025-11-26T22:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.474311 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.474365 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.474384 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.474407 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.474424 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:37Z","lastTransitionTime":"2025-11-26T22:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.518017 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:37 crc kubenswrapper[5008]: E1126 22:39:37.518269 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.577430 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.577487 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.577503 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.577528 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.577545 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:37Z","lastTransitionTime":"2025-11-26T22:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.680613 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.680682 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.680703 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.680729 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.680749 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:37Z","lastTransitionTime":"2025-11-26T22:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.760038 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs\") pod \"network-metrics-daemon-xplkg\" (UID: \"7adf9a69-5de6-4710-b394-968387df9ae6\") " pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:37 crc kubenswrapper[5008]: E1126 22:39:37.760271 5008 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 22:39:37 crc kubenswrapper[5008]: E1126 22:39:37.760387 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs podName:7adf9a69-5de6-4710-b394-968387df9ae6 nodeName:}" failed. No retries permitted until 2025-11-26 22:39:53.760359105 +0000 UTC m=+69.173053147 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs") pod "network-metrics-daemon-xplkg" (UID: "7adf9a69-5de6-4710-b394-968387df9ae6") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.784046 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.784127 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.784150 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.784180 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.784203 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:37Z","lastTransitionTime":"2025-11-26T22:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.887886 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.887952 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.888009 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.888037 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.888060 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:37Z","lastTransitionTime":"2025-11-26T22:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.990994 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.991045 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.991062 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.991084 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:37 crc kubenswrapper[5008]: I1126 22:39:37.991098 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:37Z","lastTransitionTime":"2025-11-26T22:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.094733 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.094792 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.094814 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.094837 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.094856 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:38Z","lastTransitionTime":"2025-11-26T22:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.198401 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.198462 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.198490 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.198514 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.198533 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:38Z","lastTransitionTime":"2025-11-26T22:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.267625 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:39:38 crc kubenswrapper[5008]: E1126 22:39:38.267815 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 22:40:10.267775368 +0000 UTC m=+85.680469410 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.268070 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:38 crc kubenswrapper[5008]: E1126 22:39:38.268204 5008 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 22:39:38 crc kubenswrapper[5008]: E1126 22:39:38.268290 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 22:40:10.268274254 +0000 UTC m=+85.680968286 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.300897 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.300940 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.300959 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.301009 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.301026 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:38Z","lastTransitionTime":"2025-11-26T22:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.369274 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.369435 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.369510 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:38 crc kubenswrapper[5008]: E1126 22:39:38.369683 5008 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 22:39:38 crc kubenswrapper[5008]: E1126 22:39:38.369840 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 22:40:10.369808362 +0000 UTC m=+85.782502404 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 22:39:38 crc kubenswrapper[5008]: E1126 22:39:38.370141 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 22:39:38 crc kubenswrapper[5008]: E1126 22:39:38.370333 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 22:39:38 crc kubenswrapper[5008]: E1126 22:39:38.370489 5008 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:38 crc kubenswrapper[5008]: E1126 22:39:38.370680 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 22:40:10.370652498 +0000 UTC m=+85.783346540 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:38 crc kubenswrapper[5008]: E1126 22:39:38.370142 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 22:39:38 crc kubenswrapper[5008]: E1126 22:39:38.370942 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 22:39:38 crc kubenswrapper[5008]: E1126 22:39:38.371121 5008 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:38 crc kubenswrapper[5008]: E1126 22:39:38.371302 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 22:40:10.371284528 +0000 UTC m=+85.783978560 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.404114 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.404171 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.404188 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.404211 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.404231 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:38Z","lastTransitionTime":"2025-11-26T22:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.507528 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.507584 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.507609 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.507640 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.507664 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:38Z","lastTransitionTime":"2025-11-26T22:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.517426 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.517476 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.517521 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:38 crc kubenswrapper[5008]: E1126 22:39:38.517653 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:38 crc kubenswrapper[5008]: E1126 22:39:38.517867 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:38 crc kubenswrapper[5008]: E1126 22:39:38.518049 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.611181 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.611550 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.611841 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.612110 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.612321 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:38Z","lastTransitionTime":"2025-11-26T22:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.715251 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.715312 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.715329 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.715352 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.715369 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:38Z","lastTransitionTime":"2025-11-26T22:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.818907 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.818960 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.819000 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.819023 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.819040 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:38Z","lastTransitionTime":"2025-11-26T22:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.906745 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.920081 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.921846 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.921881 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.921893 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.921910 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.921922 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:38Z","lastTransitionTime":"2025-11-26T22:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.929242 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:38Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.950213 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:38Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.971296 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:38Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:38 crc kubenswrapper[5008]: I1126 22:39:38.990684 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:38Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.024846 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:39 crc 
kubenswrapper[5008]: I1126 22:39:39.024904 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.024922 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.024948 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.024994 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:39Z","lastTransitionTime":"2025-11-26T22:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.028877 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"message\\\":\\\"Namespace event handler 1 for removal\\\\nI1126 22:39:20.777606 6466 handler.go:190] 
Sending *v1.Namespace event handler 5 for removal\\\\nI1126 22:39:20.777621 6466 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 22:39:20.777641 6466 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 22:39:20.777668 6466 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 22:39:20.777697 6466 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 22:39:20.777736 6466 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1126 22:39:20.777750 6466 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1126 22:39:20.777765 6466 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 22:39:20.777776 6466 factory.go:656] Stopping watch factory\\\\nI1126 22:39:20.777807 6466 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1126 22:39:20.777820 6466 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 22:39:20.777832 6466 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1126 22:39:20.777895 6466 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 22:39:20.777941 6466 ovnkube.go:599] Stopped ovnkube\\\\nI1126 22:39:20.777995 6466 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 22:39:20.778068 6466 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zpbmz_openshift-ovn-kubernetes(41e5d1a8-86e0-42e2-a446-8f8938091dc1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63
968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:39Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.050127 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:39Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.067073 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xplkg" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7adf9a69-5de6-4710-b394-968387df9ae6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xplkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:39Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:39 crc 
kubenswrapper[5008]: I1126 22:39:39.097711 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:39Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.120102 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:39Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.127534 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.127574 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.127585 5008 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.127603 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.127614 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:39Z","lastTransitionTime":"2025-11-26T22:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.137488 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-11-26T22:39:39Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.150938 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:39Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.162905 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:39Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.176518 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc9433194eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:39Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.191764 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fe79b6-3204-4cbc-b84f-2f700281ab05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94e2f28b2aa2ba5a228b7d53f35c8176619953fe656fecb08d539d6cd48d5207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e0874f3905e78658040b32877f9ac9910a494c9251878cdd8b31c83d89aa3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vwnrv\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:39Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.209058 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9
b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:39Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.224645 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:39Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.229890 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.230152 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.230338 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.230595 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.230781 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:39Z","lastTransitionTime":"2025-11-26T22:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.239811 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"nam
e\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whe
reabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:39Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.333897 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.334289 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.334497 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.334658 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.334805 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:39Z","lastTransitionTime":"2025-11-26T22:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.438460 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.438514 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.438530 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.438554 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.438573 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:39Z","lastTransitionTime":"2025-11-26T22:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.517667 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:39 crc kubenswrapper[5008]: E1126 22:39:39.517925 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.519079 5008 scope.go:117] "RemoveContainer" containerID="c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.541622 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.541687 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.541704 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.541729 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.541746 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:39Z","lastTransitionTime":"2025-11-26T22:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.649499 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.649940 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.649955 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.649993 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.650006 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:39Z","lastTransitionTime":"2025-11-26T22:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.753036 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.753063 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.753071 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.753083 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.753092 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:39Z","lastTransitionTime":"2025-11-26T22:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.855835 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.855892 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.855913 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.855941 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.856001 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:39Z","lastTransitionTime":"2025-11-26T22:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.915752 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovnkube-controller/1.log" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.920707 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerStarted","Data":"03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb"} Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.921480 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.943789 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:39Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.958824 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.958886 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.958910 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:39 crc 
kubenswrapper[5008]: I1126 22:39:39.958941 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.958992 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:39Z","lastTransitionTime":"2025-11-26T22:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.965327 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:39Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:39 crc kubenswrapper[5008]: I1126 22:39:39.985314 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafb
eeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:39Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.020485 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"message\\\":\\\"Namespace event handler 1 for removal\\\\nI1126 22:39:20.777606 6466 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 22:39:20.777621 6466 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 22:39:20.777641 6466 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 22:39:20.777668 6466 handler.go:190] 
Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 22:39:20.777697 6466 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 22:39:20.777736 6466 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1126 22:39:20.777750 6466 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1126 22:39:20.777765 6466 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 22:39:20.777776 6466 factory.go:656] Stopping watch factory\\\\nI1126 22:39:20.777807 6466 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1126 22:39:20.777820 6466 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 22:39:20.777832 6466 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1126 22:39:20.777895 6466 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 22:39:20.777941 6466 ovnkube.go:599] Stopped ovnkube\\\\nI1126 22:39:20.777995 6466 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 22:39:20.778068 6466 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:40Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.037488 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:40Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.055413 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xplkg" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7adf9a69-5de6-4710-b394-968387df9ae6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xplkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:40Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:40 crc 
kubenswrapper[5008]: I1126 22:39:40.061281 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.061342 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.061359 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.061385 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.061411 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:40Z","lastTransitionTime":"2025-11-26T22:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.078255 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:40Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.100657 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"791021bb-9ed3-4c4f-8081-2f84615bad85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709d4c7c869a6123fb3da1ed7bdc57c002e11b9eabcd3a44112be80f50f101a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94ef8187fdbc6022fe2f451e4d267e0311448a6820236175adce557f66f2bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://071f58b25e93d75bba1ef6b9123717c1e8887e869077efebf70fc55f91d8fe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:40Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.121575 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:40Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.135063 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:40Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.153162 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:40Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.165181 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:40Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.173946 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:40Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.182606 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.182632 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.182640 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.182654 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.182665 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:40Z","lastTransitionTime":"2025-11-26T22:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.186332 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc9433194eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:40Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.203077 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fe79b6-3204-4cbc-b84f-2f700281ab05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94e2f28b2
aa2ba5a228b7d53f35c8176619953fe656fecb08d539d6cd48d5207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e0874f3905e78658040b32877f9ac9910a494c9251878cdd8b31c83d89aa3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vwnrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:40Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.216076 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10f
dee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:40Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.229054 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7
f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:40Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.241383 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:40Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.285480 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.285547 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.285559 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.285576 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.285588 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:40Z","lastTransitionTime":"2025-11-26T22:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.388213 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.388263 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.388274 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.388294 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.388306 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:40Z","lastTransitionTime":"2025-11-26T22:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.491120 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.491176 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.491191 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.491212 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.491227 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:40Z","lastTransitionTime":"2025-11-26T22:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.517850 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.517848 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.518047 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:40 crc kubenswrapper[5008]: E1126 22:39:40.518149 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:40 crc kubenswrapper[5008]: E1126 22:39:40.518306 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:40 crc kubenswrapper[5008]: E1126 22:39:40.518458 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.593599 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.593659 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.593674 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.593695 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.593714 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:40Z","lastTransitionTime":"2025-11-26T22:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.696624 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.697047 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.697064 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.697089 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.697107 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:40Z","lastTransitionTime":"2025-11-26T22:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.800190 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.800249 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.800279 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.800303 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.800321 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:40Z","lastTransitionTime":"2025-11-26T22:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.903757 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.903841 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.903866 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.903948 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.904018 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:40Z","lastTransitionTime":"2025-11-26T22:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.927658 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovnkube-controller/2.log" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.929010 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovnkube-controller/1.log" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.935256 5008 generic.go:334] "Generic (PLEG): container finished" podID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerID="03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb" exitCode=1 Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.935355 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerDied","Data":"03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb"} Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.935427 5008 scope.go:117] "RemoveContainer" containerID="c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.936553 5008 scope.go:117] "RemoveContainer" containerID="03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb" Nov 26 22:39:40 crc kubenswrapper[5008]: E1126 22:39:40.936758 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zpbmz_openshift-ovn-kubernetes(41e5d1a8-86e0-42e2-a446-8f8938091dc1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.970279 5008 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd
2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba5
7b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:40Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.983345 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"791021bb-9ed3-4c4f-8081-2f84615bad85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709d4c7c869a6123fb3da1ed7bdc57c002e11b9eabcd3a44112be80f50f101a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347202
43b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94ef8187fdbc6022fe2f451e4d267e0311448a6820236175adce557f66f2bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://071f58b25e93d75bba1ef6b9123717c1e8887e869077efebf70fc55f91d8fe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-1
1-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:40Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:40 crc kubenswrapper[5008]: I1126 22:39:40.998222 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:40Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.006250 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.006287 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.006298 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.006316 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.006329 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:41Z","lastTransitionTime":"2025-11-26T22:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.012148 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:41Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.025343 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fe79b6-3204-4cbc-b84f-2f700281ab05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94e2f28b2aa2ba5a228b7d53f35c8176619953fe656fecb08d539d6cd48d5207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e0874f3905e78658040b32877f9ac9910a49
4c9251878cdd8b31c83d89aa3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vwnrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:41Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.037265 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T22:39:41Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.048388 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:41Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.060763 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc94331
94eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:41Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.077334 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\"
,\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1
e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:41Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.097764 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:41Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.109109 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.109151 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.109163 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.109179 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.109191 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:41Z","lastTransitionTime":"2025-11-26T22:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.121138 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"nam
e\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whe
reabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:41Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.149924 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c836a1b07faa5468a8cd785a511d5bddc41244629fbd1bfe0e79f30a6df09ded\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"message\\\":\\\"Namespace event handler 1 for removal\\\\nI1126 22:39:20.777606 6466 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 22:39:20.777621 6466 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 22:39:20.777641 6466 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 22:39:20.777668 6466 handler.go:190] 
Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 22:39:20.777697 6466 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 22:39:20.777736 6466 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1126 22:39:20.777750 6466 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1126 22:39:20.777765 6466 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 22:39:20.777776 6466 factory.go:656] Stopping watch factory\\\\nI1126 22:39:20.777807 6466 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1126 22:39:20.777820 6466 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 22:39:20.777832 6466 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1126 22:39:20.777895 6466 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 22:39:20.777941 6466 ovnkube.go:599] Stopped ovnkube\\\\nI1126 22:39:20.777995 6466 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 22:39:20.778068 6466 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:40Z\\\",\\\"message\\\":\\\"empting retry of *v1.Pod openshift-multus/network-metrics-daemon-xplkg before timer (time: 2025-11-26 22:39:41.876747616 +0000 UTC m=+1.899387262): skip\\\\nI1126 22:39:40.604512 6689 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1126 22:39:40.604531 6689 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1126 22:39:40.604547 6689 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI1126 22:39:40.604574 6689 
obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 145.815µs)\\\\nI1126 22:39:40.604712 6689 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1126 22:39:40.604782 6689 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 22:39:40.604820 6689 factory.go:656] Stopping watch factory\\\\nI1126 22:39:40.604827 6689 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 22:39:40.604838 6689 ovnkube.go:599] Stopped ovnkube\\\\nI1126 22:39:40.604859 6689 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 22:39:40.604886 6689 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 22:39:40.604961 6689 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:41Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.165631 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin
\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:41Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.178416 5008 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-xplkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7adf9a69-5de6-4710-b394-968387df9ae6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xplkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:41Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:41 crc 
kubenswrapper[5008]: I1126 22:39:41.193438 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:41Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.209946 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:41Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.212864 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.212921 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.212935 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:41 crc 
kubenswrapper[5008]: I1126 22:39:41.212952 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.213264 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:41Z","lastTransitionTime":"2025-11-26T22:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.222658 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:41Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.234684 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafb
eeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:41Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.316660 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.316727 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.316750 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:41 crc 
kubenswrapper[5008]: I1126 22:39:41.316782 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.316804 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:41Z","lastTransitionTime":"2025-11-26T22:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.419563 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.419602 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.419612 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.419628 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.419640 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:41Z","lastTransitionTime":"2025-11-26T22:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.518110 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:41 crc kubenswrapper[5008]: E1126 22:39:41.518398 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.523521 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.523578 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.523596 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.523619 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.523640 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:41Z","lastTransitionTime":"2025-11-26T22:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.626269 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.626324 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.626341 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.626365 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.626381 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:41Z","lastTransitionTime":"2025-11-26T22:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.729383 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.729467 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.729492 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.729525 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.729551 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:41Z","lastTransitionTime":"2025-11-26T22:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.833013 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.833095 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.833122 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.833154 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.833178 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:41Z","lastTransitionTime":"2025-11-26T22:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.936893 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.937338 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.937584 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.938037 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.938082 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:41Z","lastTransitionTime":"2025-11-26T22:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.941390 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovnkube-controller/2.log" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.946243 5008 scope.go:117] "RemoveContainer" containerID="03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb" Nov 26 22:39:41 crc kubenswrapper[5008]: E1126 22:39:41.946584 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zpbmz_openshift-ovn-kubernetes(41e5d1a8-86e0-42e2-a446-8f8938091dc1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.968130 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:41Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:41 crc kubenswrapper[5008]: I1126 22:39:41.983682 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xplkg" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7adf9a69-5de6-4710-b394-968387df9ae6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xplkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:41Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:42 crc 
kubenswrapper[5008]: I1126 22:39:42.001954 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:41Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.023151 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:42Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.041682 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.041755 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.041774 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:42 crc 
kubenswrapper[5008]: I1126 22:39:42.041800 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.041819 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:42Z","lastTransitionTime":"2025-11-26T22:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.045743 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:42Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.062713 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafb
eeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:42Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.082440 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:40Z\\\",\\\"message\\\":\\\"empting retry of *v1.Pod openshift-multus/network-metrics-daemon-xplkg before timer (time: 2025-11-26 22:39:41.876747616 +0000 UTC m=+1.899387262): skip\\\\nI1126 22:39:40.604512 6689 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1126 22:39:40.604531 6689 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1126 22:39:40.604547 6689 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI1126 22:39:40.604574 6689 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 145.815µs)\\\\nI1126 22:39:40.604712 6689 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1126 22:39:40.604782 6689 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 22:39:40.604820 6689 factory.go:656] Stopping watch factory\\\\nI1126 22:39:40.604827 6689 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 22:39:40.604838 6689 ovnkube.go:599] Stopped ovnkube\\\\nI1126 22:39:40.604859 6689 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 22:39:40.604886 6689 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 22:39:40.604961 6689 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zpbmz_openshift-ovn-kubernetes(41e5d1a8-86e0-42e2-a446-8f8938091dc1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63
968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:42Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.116190 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:42Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.144270 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.144314 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.144323 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.144339 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.144347 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:42Z","lastTransitionTime":"2025-11-26T22:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.144348 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"791021bb-9ed3-4c4f-8081-2f84615bad85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709d4c7c869a6123fb3da1ed7bdc57c002e11b9eabcd3a44112be80f50f101a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94ef8187fdbc6022fe2f451e4d267
e0311448a6820236175adce557f66f2bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://071f58b25e93d75bba1ef6b9123717c1e8887e869077efebf70fc55f91d8fe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:42Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.156725 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:42Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.168528 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:42Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.180667 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T22:39:42Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.190491 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:42Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.200093 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc94331
94eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:42Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.209765 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fe79b6-3204-4cbc-b84f-2f700281ab05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94e2f28b2aa2ba5a228b7d53f35c8176619953fe656fecb08d539d6cd48d5207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e0874f3905e78658040b32877f9ac9910a49
4c9251878cdd8b31c83d89aa3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vwnrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:42Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.225741 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\"
,\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1
e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:42Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.236831 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:42Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.246200 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.246238 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.246249 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.246265 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.246277 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:42Z","lastTransitionTime":"2025-11-26T22:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.255787 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"nam
e\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whe
reabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:42Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.349092 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.349158 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.349175 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.349200 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.349218 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:42Z","lastTransitionTime":"2025-11-26T22:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.452633 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.452680 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.452692 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.452710 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.452721 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:42Z","lastTransitionTime":"2025-11-26T22:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.517693 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.517764 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.517720 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:42 crc kubenswrapper[5008]: E1126 22:39:42.517932 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:42 crc kubenswrapper[5008]: E1126 22:39:42.518095 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:42 crc kubenswrapper[5008]: E1126 22:39:42.518211 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.555892 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.555944 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.555989 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.556013 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.556031 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:42Z","lastTransitionTime":"2025-11-26T22:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.659260 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.659346 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.659376 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.659409 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.659431 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:42Z","lastTransitionTime":"2025-11-26T22:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.762304 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.762369 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.762387 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.762412 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.762433 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:42Z","lastTransitionTime":"2025-11-26T22:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.865719 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.865774 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.865792 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.865816 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.865834 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:42Z","lastTransitionTime":"2025-11-26T22:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.968563 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.968595 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.968606 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.968620 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:42 crc kubenswrapper[5008]: I1126 22:39:42.968631 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:42Z","lastTransitionTime":"2025-11-26T22:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.071716 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.071782 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.071804 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.071827 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.071844 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:43Z","lastTransitionTime":"2025-11-26T22:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.174697 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.174762 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.174782 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.174808 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.174857 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:43Z","lastTransitionTime":"2025-11-26T22:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.277688 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.277752 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.277771 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.277849 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.277881 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:43Z","lastTransitionTime":"2025-11-26T22:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.381259 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.381300 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.381310 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.381325 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.381336 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:43Z","lastTransitionTime":"2025-11-26T22:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.479207 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.479248 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.479261 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.479277 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.479288 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:43Z","lastTransitionTime":"2025-11-26T22:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:43 crc kubenswrapper[5008]: E1126 22:39:43.501721 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:43Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.507339 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.507459 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.507487 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.507519 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.507543 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:43Z","lastTransitionTime":"2025-11-26T22:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.517728 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:43 crc kubenswrapper[5008]: E1126 22:39:43.517901 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:39:43 crc kubenswrapper[5008]: E1126 22:39:43.531308 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:43Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.536871 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.536953 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.537006 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.537037 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.537059 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:43Z","lastTransitionTime":"2025-11-26T22:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:43 crc kubenswrapper[5008]: E1126 22:39:43.556682 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:43Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.561675 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.561890 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.562076 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.562234 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.562368 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:43Z","lastTransitionTime":"2025-11-26T22:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:43 crc kubenswrapper[5008]: E1126 22:39:43.584564 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:43Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.589142 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.589243 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.589304 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.589330 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.589385 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:43Z","lastTransitionTime":"2025-11-26T22:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:43 crc kubenswrapper[5008]: E1126 22:39:43.609477 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:43Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:43 crc kubenswrapper[5008]: E1126 22:39:43.609812 5008 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.612074 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.612128 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.612145 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.612168 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.612184 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:43Z","lastTransitionTime":"2025-11-26T22:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.714527 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.714577 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.714592 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.714612 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.714631 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:43Z","lastTransitionTime":"2025-11-26T22:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.816717 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.816790 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.816814 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.816844 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.816868 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:43Z","lastTransitionTime":"2025-11-26T22:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.919814 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.919889 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.919904 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.919922 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:43 crc kubenswrapper[5008]: I1126 22:39:43.919936 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:43Z","lastTransitionTime":"2025-11-26T22:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.022986 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.023058 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.023070 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.023092 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.023105 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:44Z","lastTransitionTime":"2025-11-26T22:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.126269 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.126314 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.126326 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.126340 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.126353 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:44Z","lastTransitionTime":"2025-11-26T22:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.228816 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.228847 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.228859 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.228873 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.228884 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:44Z","lastTransitionTime":"2025-11-26T22:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.331199 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.331246 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.331257 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.331273 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.331284 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:44Z","lastTransitionTime":"2025-11-26T22:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.433738 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.433815 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.433834 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.433866 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.433891 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:44Z","lastTransitionTime":"2025-11-26T22:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.517696 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.517741 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.517784 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:44 crc kubenswrapper[5008]: E1126 22:39:44.517915 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:44 crc kubenswrapper[5008]: E1126 22:39:44.518092 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:44 crc kubenswrapper[5008]: E1126 22:39:44.518282 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.536539 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.536581 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.536599 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.536619 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.536636 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:44Z","lastTransitionTime":"2025-11-26T22:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.638915 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.639006 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.639035 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.639066 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.639089 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:44Z","lastTransitionTime":"2025-11-26T22:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.741632 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.741695 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.741717 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.741748 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.741770 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:44Z","lastTransitionTime":"2025-11-26T22:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.845733 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.845808 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.845846 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.845878 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.845900 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:44Z","lastTransitionTime":"2025-11-26T22:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.948435 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.948489 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.948506 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.948531 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:44 crc kubenswrapper[5008]: I1126 22:39:44.948547 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:44Z","lastTransitionTime":"2025-11-26T22:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.051921 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.052042 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.052062 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.052090 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.052108 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:45Z","lastTransitionTime":"2025-11-26T22:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.154668 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.154726 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.154743 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.154768 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.154785 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:45Z","lastTransitionTime":"2025-11-26T22:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.257855 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.258437 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.258480 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.258510 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.258528 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:45Z","lastTransitionTime":"2025-11-26T22:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.367667 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.367732 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.367746 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.367763 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.367775 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:45Z","lastTransitionTime":"2025-11-26T22:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.470855 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.470938 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.470998 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.471032 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.471059 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:45Z","lastTransitionTime":"2025-11-26T22:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.517512 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:45 crc kubenswrapper[5008]: E1126 22:39:45.517825 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.539417 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:45Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.555865 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:45Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.571020 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc9433194eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:45Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.574060 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.574230 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.574315 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.574435 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.574526 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:45Z","lastTransitionTime":"2025-11-26T22:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.586193 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fe79b6-3204-4cbc-b84f-2f700281ab05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94e2f28b2aa2ba5a228b7d53f35c8176619953fe656fecb08d539d6cd48d5207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e0874f3905e78658040b32877f9ac9910a494c9251878cdd8b31c83d89aa3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vwnrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:45Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.608155 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T
22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7
c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:45Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.624606 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:45Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.639657 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7
f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:45Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.651137 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:45Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.661673 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xplkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7adf9a69-5de6-4710-b394-968387df9ae6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xplkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:45Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:45 crc 
kubenswrapper[5008]: I1126 22:39:45.673660 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:45Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.677558 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.677595 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.677608 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.677625 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.677637 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:45Z","lastTransitionTime":"2025-11-26T22:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.693416 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:45Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.708194 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:45Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.719427 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:45Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.740454 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:40Z\\\",\\\"message\\\":\\\"empting retry of *v1.Pod openshift-multus/network-metrics-daemon-xplkg before timer (time: 2025-11-26 22:39:41.876747616 +0000 UTC m=+1.899387262): skip\\\\nI1126 22:39:40.604512 6689 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1126 22:39:40.604531 6689 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1126 22:39:40.604547 6689 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI1126 22:39:40.604574 6689 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 145.815µs)\\\\nI1126 22:39:40.604712 6689 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1126 22:39:40.604782 6689 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 22:39:40.604820 6689 factory.go:656] Stopping watch factory\\\\nI1126 22:39:40.604827 6689 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 22:39:40.604838 6689 ovnkube.go:599] Stopped ovnkube\\\\nI1126 22:39:40.604859 6689 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 22:39:40.604886 6689 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 22:39:40.604961 6689 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zpbmz_openshift-ovn-kubernetes(41e5d1a8-86e0-42e2-a446-8f8938091dc1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63
968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:45Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.767482 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:45Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.780393 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.780472 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.780498 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.780535 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.780561 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:45Z","lastTransitionTime":"2025-11-26T22:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.781072 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"791021bb-9ed3-4c4f-8081-2f84615bad85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709d4c7c869a6123fb3da1ed7bdc57c002e11b9eabcd3a44112be80f50f101a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94ef8187fdbc6022fe2f451e4d267
e0311448a6820236175adce557f66f2bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://071f58b25e93d75bba1ef6b9123717c1e8887e869077efebf70fc55f91d8fe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:45Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.797770 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:45Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.813760 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:45Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.883089 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.883155 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.883174 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.883196 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.883214 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:45Z","lastTransitionTime":"2025-11-26T22:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.985351 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.985391 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.985404 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.985425 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:45 crc kubenswrapper[5008]: I1126 22:39:45.985443 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:45Z","lastTransitionTime":"2025-11-26T22:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.088531 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.088579 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.088593 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.088612 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.088623 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:46Z","lastTransitionTime":"2025-11-26T22:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.191758 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.191807 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.191823 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.191842 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.191854 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:46Z","lastTransitionTime":"2025-11-26T22:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.294465 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.294507 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.294519 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.294537 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.294549 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:46Z","lastTransitionTime":"2025-11-26T22:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.397351 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.397393 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.397404 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.397421 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.397435 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:46Z","lastTransitionTime":"2025-11-26T22:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.499593 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.499658 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.499681 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.499710 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.499733 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:46Z","lastTransitionTime":"2025-11-26T22:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.517910 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.518030 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:46 crc kubenswrapper[5008]: E1126 22:39:46.518092 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:46 crc kubenswrapper[5008]: E1126 22:39:46.518156 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.518218 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:46 crc kubenswrapper[5008]: E1126 22:39:46.518432 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.603392 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.603485 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.603509 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.603536 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.603561 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:46Z","lastTransitionTime":"2025-11-26T22:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.706945 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.707059 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.707086 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.707114 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.707136 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:46Z","lastTransitionTime":"2025-11-26T22:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.809701 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.809740 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.809750 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.809766 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.809799 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:46Z","lastTransitionTime":"2025-11-26T22:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.912546 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.912620 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.912638 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.912666 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:46 crc kubenswrapper[5008]: I1126 22:39:46.912686 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:46Z","lastTransitionTime":"2025-11-26T22:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.015537 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.015629 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.015663 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.015693 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.015712 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:47Z","lastTransitionTime":"2025-11-26T22:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.118677 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.118747 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.118772 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.118803 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.118824 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:47Z","lastTransitionTime":"2025-11-26T22:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.222470 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.222519 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.222535 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.222558 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.222574 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:47Z","lastTransitionTime":"2025-11-26T22:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.325774 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.325844 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.325861 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.325888 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.325912 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:47Z","lastTransitionTime":"2025-11-26T22:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.429488 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.429597 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.429625 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.429676 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.429702 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:47Z","lastTransitionTime":"2025-11-26T22:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.518511 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:47 crc kubenswrapper[5008]: E1126 22:39:47.518727 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.536299 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.536364 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.536387 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.536415 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.536435 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:47Z","lastTransitionTime":"2025-11-26T22:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.639635 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.639708 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.639728 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.639753 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.639770 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:47Z","lastTransitionTime":"2025-11-26T22:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.742836 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.742894 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.742917 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.742944 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.743005 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:47Z","lastTransitionTime":"2025-11-26T22:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.845784 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.845864 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.845892 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.845922 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.845945 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:47Z","lastTransitionTime":"2025-11-26T22:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.949614 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.949692 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.949715 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.949744 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:47 crc kubenswrapper[5008]: I1126 22:39:47.949765 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:47Z","lastTransitionTime":"2025-11-26T22:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.052820 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.052893 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.052920 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.052951 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.053015 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:48Z","lastTransitionTime":"2025-11-26T22:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.155334 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.155430 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.155461 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.155494 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.155520 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:48Z","lastTransitionTime":"2025-11-26T22:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.257829 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.257926 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.257955 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.258028 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.258053 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:48Z","lastTransitionTime":"2025-11-26T22:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.360847 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.360908 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.360922 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.360939 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.360952 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:48Z","lastTransitionTime":"2025-11-26T22:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.463458 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.463543 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.463565 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.463597 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.463620 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:48Z","lastTransitionTime":"2025-11-26T22:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.517706 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.517795 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.517717 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:48 crc kubenswrapper[5008]: E1126 22:39:48.517931 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:48 crc kubenswrapper[5008]: E1126 22:39:48.518100 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:48 crc kubenswrapper[5008]: E1126 22:39:48.518225 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.565997 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.566046 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.566059 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.566077 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.566091 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:48Z","lastTransitionTime":"2025-11-26T22:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.668759 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.668804 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.668816 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.668833 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.668846 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:48Z","lastTransitionTime":"2025-11-26T22:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.772027 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.772094 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.772112 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.772135 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.772159 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:48Z","lastTransitionTime":"2025-11-26T22:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.875182 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.875248 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.875267 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.875291 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.875307 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:48Z","lastTransitionTime":"2025-11-26T22:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.977987 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.978028 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.978040 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.978061 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:48 crc kubenswrapper[5008]: I1126 22:39:48.978072 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:48Z","lastTransitionTime":"2025-11-26T22:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.081209 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.081247 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.081256 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.081274 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.081286 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:49Z","lastTransitionTime":"2025-11-26T22:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.184010 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.184073 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.184084 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.184121 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.184133 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:49Z","lastTransitionTime":"2025-11-26T22:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.286648 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.286683 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.286692 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.286705 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.286714 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:49Z","lastTransitionTime":"2025-11-26T22:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.390730 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.390788 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.390808 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.390836 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.390859 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:49Z","lastTransitionTime":"2025-11-26T22:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.493829 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.493895 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.493907 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.493923 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.493932 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:49Z","lastTransitionTime":"2025-11-26T22:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.517724 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:49 crc kubenswrapper[5008]: E1126 22:39:49.518011 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.596910 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.597026 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.597051 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.597076 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.597095 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:49Z","lastTransitionTime":"2025-11-26T22:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.699228 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.699270 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.699301 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.699317 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.699332 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:49Z","lastTransitionTime":"2025-11-26T22:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.802498 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.802590 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.802609 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.802667 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.802686 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:49Z","lastTransitionTime":"2025-11-26T22:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.911070 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.911115 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.911125 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.911140 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:49 crc kubenswrapper[5008]: I1126 22:39:49.911150 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:49Z","lastTransitionTime":"2025-11-26T22:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.013431 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.013489 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.013506 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.013529 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.013547 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:50Z","lastTransitionTime":"2025-11-26T22:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.118186 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.118283 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.118319 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.118355 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.118375 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:50Z","lastTransitionTime":"2025-11-26T22:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.221273 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.221326 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.221338 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.221357 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.221372 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:50Z","lastTransitionTime":"2025-11-26T22:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.323862 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.323923 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.323936 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.323990 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.324012 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:50Z","lastTransitionTime":"2025-11-26T22:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.427035 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.427105 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.427138 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.427167 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.427188 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:50Z","lastTransitionTime":"2025-11-26T22:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.518120 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.518200 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:50 crc kubenswrapper[5008]: E1126 22:39:50.518306 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:50 crc kubenswrapper[5008]: E1126 22:39:50.518538 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.518721 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:50 crc kubenswrapper[5008]: E1126 22:39:50.519039 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.529240 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.529435 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.529578 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.529745 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.529885 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:50Z","lastTransitionTime":"2025-11-26T22:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.632740 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.633023 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.633484 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.633672 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.633822 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:50Z","lastTransitionTime":"2025-11-26T22:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.737649 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.738840 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.739046 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.739213 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.739372 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:50Z","lastTransitionTime":"2025-11-26T22:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.842430 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.842522 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.842555 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.842587 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.842608 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:50Z","lastTransitionTime":"2025-11-26T22:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.945081 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.945447 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.945606 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.945761 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:50 crc kubenswrapper[5008]: I1126 22:39:50.945898 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:50Z","lastTransitionTime":"2025-11-26T22:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.048318 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.048353 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.048362 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.048378 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.048389 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:51Z","lastTransitionTime":"2025-11-26T22:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.151549 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.151652 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.151685 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.151712 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.151728 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:51Z","lastTransitionTime":"2025-11-26T22:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.254238 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.254271 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.254282 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.254297 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.254308 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:51Z","lastTransitionTime":"2025-11-26T22:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.356997 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.357034 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.357043 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.357059 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.357070 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:51Z","lastTransitionTime":"2025-11-26T22:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.459062 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.459154 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.459166 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.459183 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.459195 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:51Z","lastTransitionTime":"2025-11-26T22:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.518303 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:51 crc kubenswrapper[5008]: E1126 22:39:51.518480 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.532812 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.561604 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.561635 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.561643 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.561656 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.561664 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:51Z","lastTransitionTime":"2025-11-26T22:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.664422 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.664464 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.664476 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.664492 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.664504 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:51Z","lastTransitionTime":"2025-11-26T22:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.767327 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.767354 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.767362 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.767373 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.767381 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:51Z","lastTransitionTime":"2025-11-26T22:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.870804 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.870864 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.870885 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.870909 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.870927 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:51Z","lastTransitionTime":"2025-11-26T22:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.974027 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.974082 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.974107 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.974132 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:51 crc kubenswrapper[5008]: I1126 22:39:51.974149 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:51Z","lastTransitionTime":"2025-11-26T22:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.076849 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.076939 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.076960 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.077024 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.077041 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:52Z","lastTransitionTime":"2025-11-26T22:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.179736 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.179787 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.179803 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.179824 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.179840 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:52Z","lastTransitionTime":"2025-11-26T22:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.282551 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.282592 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.282605 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.282623 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.282637 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:52Z","lastTransitionTime":"2025-11-26T22:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.384941 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.385005 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.385020 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.385036 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.385048 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:52Z","lastTransitionTime":"2025-11-26T22:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.488117 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.488197 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.488217 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.488244 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.488263 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:52Z","lastTransitionTime":"2025-11-26T22:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.517834 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.517889 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:52 crc kubenswrapper[5008]: E1126 22:39:52.518009 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.518116 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:52 crc kubenswrapper[5008]: E1126 22:39:52.518243 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:52 crc kubenswrapper[5008]: E1126 22:39:52.522456 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.591219 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.591270 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.591283 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.591299 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.591311 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:52Z","lastTransitionTime":"2025-11-26T22:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.693468 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.693504 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.693514 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.693529 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.693538 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:52Z","lastTransitionTime":"2025-11-26T22:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.795859 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.796007 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.796088 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.796140 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.796252 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:52Z","lastTransitionTime":"2025-11-26T22:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.898687 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.898740 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.898759 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.898782 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:52 crc kubenswrapper[5008]: I1126 22:39:52.898800 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:52Z","lastTransitionTime":"2025-11-26T22:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.000668 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.000710 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.000727 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.000746 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.000764 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:53Z","lastTransitionTime":"2025-11-26T22:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.102214 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.102255 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.102266 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.102279 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.102289 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:53Z","lastTransitionTime":"2025-11-26T22:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.204213 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.204259 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.204273 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.204289 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.204301 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:53Z","lastTransitionTime":"2025-11-26T22:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.307342 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.307437 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.307456 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.307485 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.307504 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:53Z","lastTransitionTime":"2025-11-26T22:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.410503 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.410558 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.410575 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.410599 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.410615 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:53Z","lastTransitionTime":"2025-11-26T22:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.513456 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.513501 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.513517 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.513538 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.513551 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:53Z","lastTransitionTime":"2025-11-26T22:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.517707 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:53 crc kubenswrapper[5008]: E1126 22:39:53.517947 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.519392 5008 scope.go:117] "RemoveContainer" containerID="03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb" Nov 26 22:39:53 crc kubenswrapper[5008]: E1126 22:39:53.519757 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zpbmz_openshift-ovn-kubernetes(41e5d1a8-86e0-42e2-a446-8f8938091dc1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.616054 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.616103 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.616114 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.616134 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.616147 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:53Z","lastTransitionTime":"2025-11-26T22:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.719832 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.719863 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.719872 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.719886 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.719895 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:53Z","lastTransitionTime":"2025-11-26T22:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.822433 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.822472 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.822483 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.822500 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.822512 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:53Z","lastTransitionTime":"2025-11-26T22:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.838235 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs\") pod \"network-metrics-daemon-xplkg\" (UID: \"7adf9a69-5de6-4710-b394-968387df9ae6\") " pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:53 crc kubenswrapper[5008]: E1126 22:39:53.838426 5008 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 22:39:53 crc kubenswrapper[5008]: E1126 22:39:53.838954 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs podName:7adf9a69-5de6-4710-b394-968387df9ae6 nodeName:}" failed. No retries permitted until 2025-11-26 22:40:25.838925589 +0000 UTC m=+101.251619631 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs") pod "network-metrics-daemon-xplkg" (UID: "7adf9a69-5de6-4710-b394-968387df9ae6") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.924492 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.924533 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.924544 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.924556 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.924567 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:53Z","lastTransitionTime":"2025-11-26T22:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.935877 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.935903 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.935912 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.935926 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.935936 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:53Z","lastTransitionTime":"2025-11-26T22:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:53 crc kubenswrapper[5008]: E1126 22:39:53.949530 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:53Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.952834 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.952863 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.952885 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.952897 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.952905 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:53Z","lastTransitionTime":"2025-11-26T22:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:53 crc kubenswrapper[5008]: E1126 22:39:53.964653 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:53Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.967800 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.967823 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.967832 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.967844 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.967852 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:53Z","lastTransitionTime":"2025-11-26T22:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:53 crc kubenswrapper[5008]: E1126 22:39:53.981387 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:53Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.985043 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.985073 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.985084 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.985096 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:53 crc kubenswrapper[5008]: I1126 22:39:53.985106 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:53Z","lastTransitionTime":"2025-11-26T22:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:53 crc kubenswrapper[5008]: E1126 22:39:53.998510 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:53Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.001540 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.001580 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.001593 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.001604 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.001613 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:54Z","lastTransitionTime":"2025-11-26T22:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:54 crc kubenswrapper[5008]: E1126 22:39:54.014055 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:39:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:54Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:54 crc kubenswrapper[5008]: E1126 22:39:54.014202 5008 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.026628 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.026655 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.026665 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.026681 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.026692 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:54Z","lastTransitionTime":"2025-11-26T22:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.129068 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.129102 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.129120 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.129142 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.129158 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:54Z","lastTransitionTime":"2025-11-26T22:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.231668 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.231705 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.232082 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.232119 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.232135 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:54Z","lastTransitionTime":"2025-11-26T22:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.335897 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.335958 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.336013 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.336041 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.336065 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:54Z","lastTransitionTime":"2025-11-26T22:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.437880 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.437910 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.437917 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.437929 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.437938 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:54Z","lastTransitionTime":"2025-11-26T22:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.517450 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.517450 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:54 crc kubenswrapper[5008]: E1126 22:39:54.517568 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:54 crc kubenswrapper[5008]: E1126 22:39:54.517623 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.517462 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:54 crc kubenswrapper[5008]: E1126 22:39:54.517672 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.539905 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.539934 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.539947 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.539978 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.539990 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:54Z","lastTransitionTime":"2025-11-26T22:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.643080 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.643131 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.643149 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.643170 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.643186 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:54Z","lastTransitionTime":"2025-11-26T22:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.745334 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.745372 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.745384 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.745398 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.745409 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:54Z","lastTransitionTime":"2025-11-26T22:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.848352 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.848408 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.848419 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.848437 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.848449 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:54Z","lastTransitionTime":"2025-11-26T22:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.951317 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.951350 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.951362 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.951384 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:54 crc kubenswrapper[5008]: I1126 22:39:54.951395 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:54Z","lastTransitionTime":"2025-11-26T22:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.053765 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.053800 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.053811 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.053826 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.053837 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:55Z","lastTransitionTime":"2025-11-26T22:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.156543 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.156587 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.156599 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.156615 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.156626 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:55Z","lastTransitionTime":"2025-11-26T22:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.258501 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.258541 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.258552 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.258568 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.258579 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:55Z","lastTransitionTime":"2025-11-26T22:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.361500 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.361557 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.361575 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.361601 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.361620 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:55Z","lastTransitionTime":"2025-11-26T22:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.464290 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.464322 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.464333 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.464347 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.464360 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:55Z","lastTransitionTime":"2025-11-26T22:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.517494 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:55 crc kubenswrapper[5008]: E1126 22:39:55.517776 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.532873 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"791021bb-9ed3-4c4f-8081-2f84615bad85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709d4c7c869a6123fb3da1ed7bdc57c002e11b9eabcd3a44112be80f50f101a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://c94ef8187fdbc6022fe2f451e4d267e0311448a6820236175adce557f66f2bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://071f58b25e93d75bba1ef6b9123717c1e8887e869077efebf70fc55f91d8fe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:55Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.556397 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:55Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.566561 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 
22:39:55.566856 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.567109 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.567369 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.567580 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:55Z","lastTransitionTime":"2025-11-26T22:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.572072 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:55Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.594567 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:55Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.612656 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:55Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.626594 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:55Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.639805 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc9433194eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:55Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.649081 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fe79b6-3204-4cbc-b84f-2f700281ab05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94e2f28b2aa2ba5a228b7d53f35c8176619953fe656fecb08d539d6cd48d5207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e0874f3905e78658040b32877f9ac9910a49
4c9251878cdd8b31c83d89aa3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vwnrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:55Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.657889 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75e29703-31bc-4758-94f6-81214c6d4943\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8a32772c75918a8f47660b624d60692798964409588b64b0ba25f26d3b061a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d019140fe5457229b7831f556c5de41b4c2dd9f1e070670dcd29469f573fab7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d019140fe5457229b7831f556c5de41b4c2dd9f1e070670dcd29469f573fab7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:55Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.669033 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:55Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.670357 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.670377 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.670386 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.670398 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.670406 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:55Z","lastTransitionTime":"2025-11-26T22:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.682426 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"nam
e\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whe
reabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:55Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.693750 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06
bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 
only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:55Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.709008 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:55Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.721015 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:55Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.736387 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:55Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.756682 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:40Z\\\",\\\"message\\\":\\\"empting retry of *v1.Pod openshift-multus/network-metrics-daemon-xplkg before timer (time: 2025-11-26 22:39:41.876747616 +0000 UTC m=+1.899387262): skip\\\\nI1126 22:39:40.604512 6689 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1126 22:39:40.604531 6689 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1126 22:39:40.604547 6689 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI1126 22:39:40.604574 6689 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 145.815µs)\\\\nI1126 22:39:40.604712 6689 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1126 22:39:40.604782 6689 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 22:39:40.604820 6689 factory.go:656] Stopping watch factory\\\\nI1126 22:39:40.604827 6689 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 22:39:40.604838 6689 ovnkube.go:599] Stopped ovnkube\\\\nI1126 22:39:40.604859 6689 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 22:39:40.604886 6689 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 22:39:40.604961 6689 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zpbmz_openshift-ovn-kubernetes(41e5d1a8-86e0-42e2-a446-8f8938091dc1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63
968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:55Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.772131 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:55Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.772271 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:55 crc 
kubenswrapper[5008]: I1126 22:39:55.772296 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.772308 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.772322 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.772330 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:55Z","lastTransitionTime":"2025-11-26T22:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.782169 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xplkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7adf9a69-5de6-4710-b394-968387df9ae6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xplkg\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:55Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.792766 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:55Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.874868 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.874897 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.874906 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.874917 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.874926 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:55Z","lastTransitionTime":"2025-11-26T22:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.977023 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.977071 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.977084 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.977101 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:55 crc kubenswrapper[5008]: I1126 22:39:55.977114 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:55Z","lastTransitionTime":"2025-11-26T22:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.079543 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.079581 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.079591 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.079607 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.079648 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:56Z","lastTransitionTime":"2025-11-26T22:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.181900 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.181951 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.182003 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.182027 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.182045 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:56Z","lastTransitionTime":"2025-11-26T22:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.284536 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.284627 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.284672 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.284696 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.284712 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:56Z","lastTransitionTime":"2025-11-26T22:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.387548 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.387586 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.387595 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.387612 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.387621 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:56Z","lastTransitionTime":"2025-11-26T22:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.489936 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.490014 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.490029 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.490052 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.490072 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:56Z","lastTransitionTime":"2025-11-26T22:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.517509 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.517548 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:56 crc kubenswrapper[5008]: E1126 22:39:56.517671 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.517525 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:56 crc kubenswrapper[5008]: E1126 22:39:56.517852 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:56 crc kubenswrapper[5008]: E1126 22:39:56.517998 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.592455 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.592690 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.592758 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.592830 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.592901 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:56Z","lastTransitionTime":"2025-11-26T22:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.695675 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.695722 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.695736 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.695755 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.695768 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:56Z","lastTransitionTime":"2025-11-26T22:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.797910 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.797954 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.798004 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.798031 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.798051 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:56Z","lastTransitionTime":"2025-11-26T22:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.901171 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.901246 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.901263 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.901287 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.901304 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:56Z","lastTransitionTime":"2025-11-26T22:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.995158 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r4xtd_8509b0e0-c914-44a1-a657-ffb4f5a86c18/kube-multus/0.log" Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.995249 5008 generic.go:334] "Generic (PLEG): container finished" podID="8509b0e0-c914-44a1-a657-ffb4f5a86c18" containerID="56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625" exitCode=1 Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.995295 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r4xtd" event={"ID":"8509b0e0-c914-44a1-a657-ffb4f5a86c18","Type":"ContainerDied","Data":"56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625"} Nov 26 22:39:56 crc kubenswrapper[5008]: I1126 22:39:56.995716 5008 scope.go:117] "RemoveContainer" containerID="56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.004392 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.004467 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.004495 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.004528 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.004550 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:57Z","lastTransitionTime":"2025-11-26T22:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.012756 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75e29703-31bc-4758-94f6-81214c6d4943\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8a32772c75918a8f47660b624d60692798964409588b64b0ba25f26d3b061a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var
/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d019140fe5457229b7831f556c5de41b4c2dd9f1e070670dcd29469f573fab7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d019140fe5457229b7831f556c5de41b4c2dd9f1e070670dcd29469f573fab7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:57Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.026935 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T22:39:57Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.037730 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:57Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.047117 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc94331
94eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:57Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.058855 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fe79b6-3204-4cbc-b84f-2f700281ab05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94e2f28b2aa2ba5a228b7d53f35c8176619953fe656fecb08d539d6cd48d5207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e0874f3905e78658040b32877f9ac9910a49
4c9251878cdd8b31c83d89aa3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vwnrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:57Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.073387 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\"
,\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1
e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:57Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.085600 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:57Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.101029 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7
f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:57Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.107150 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.107186 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.107196 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.107211 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.107223 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:57Z","lastTransitionTime":"2025-11-26T22:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.115863 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:57Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.127883 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:57Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.138664 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:57Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.150118 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:57Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.170144 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:40Z\\\",\\\"message\\\":\\\"empting retry of *v1.Pod openshift-multus/network-metrics-daemon-xplkg before timer (time: 2025-11-26 22:39:41.876747616 +0000 UTC m=+1.899387262): skip\\\\nI1126 22:39:40.604512 6689 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1126 22:39:40.604531 6689 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1126 22:39:40.604547 6689 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI1126 22:39:40.604574 6689 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 145.815µs)\\\\nI1126 22:39:40.604712 6689 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1126 22:39:40.604782 6689 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 22:39:40.604820 6689 factory.go:656] Stopping watch factory\\\\nI1126 22:39:40.604827 6689 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 22:39:40.604838 6689 ovnkube.go:599] Stopped ovnkube\\\\nI1126 22:39:40.604859 6689 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 22:39:40.604886 6689 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 22:39:40.604961 6689 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zpbmz_openshift-ovn-kubernetes(41e5d1a8-86e0-42e2-a446-8f8938091dc1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63
968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:57Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.187259 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:56Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:56Z\\\",\\\"message\\\":\\\"2025-11-26T22:39:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c2e69fa-5c39-46a4-b451-b7f299229962\\\\n2025-11-26T22:39:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c2e69fa-5c39-46a4-b451-b7f299229962 to /host/opt/cni/bin/\\\\n2025-11-26T22:39:11Z [verbose] multus-daemon started\\\\n2025-11-26T22:39:11Z [verbose] Readiness Indicator file check\\\\n2025-11-26T22:39:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:57Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.199831 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xplkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7adf9a69-5de6-4710-b394-968387df9ae6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xplkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:57Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:57 crc 
kubenswrapper[5008]: I1126 22:39:57.209049 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.209110 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.209122 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.209159 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.209173 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:57Z","lastTransitionTime":"2025-11-26T22:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.223696 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:57Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.236986 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"791021bb-9ed3-4c4f-8081-2f84615bad85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709d4c7c869a6123fb3da1ed7bdc57c002e11b9eabcd3a44112be80f50f101a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94ef8187fdbc6022fe2f451e4d267e0311448a6820236175adce557f66f2bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://071f58b25e93d75bba1ef6b9123717c1e8887e869077efebf70fc55f91d8fe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:57Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.256374 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:57Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.279452 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:57Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.311625 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.311675 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.311685 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.311700 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.311711 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:57Z","lastTransitionTime":"2025-11-26T22:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.413663 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.413709 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.413722 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.413737 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.413746 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:57Z","lastTransitionTime":"2025-11-26T22:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.516775 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.516845 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.516862 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.516886 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.516903 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:57Z","lastTransitionTime":"2025-11-26T22:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.518148 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:57 crc kubenswrapper[5008]: E1126 22:39:57.518415 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.620403 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.620461 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.620503 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.620528 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.620546 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:57Z","lastTransitionTime":"2025-11-26T22:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.722936 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.722989 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.722999 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.723013 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.723022 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:57Z","lastTransitionTime":"2025-11-26T22:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.825225 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.825251 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.825259 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.825271 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.825280 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:57Z","lastTransitionTime":"2025-11-26T22:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.927657 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.927702 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.927715 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.927733 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.927743 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:57Z","lastTransitionTime":"2025-11-26T22:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.998853 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r4xtd_8509b0e0-c914-44a1-a657-ffb4f5a86c18/kube-multus/0.log" Nov 26 22:39:57 crc kubenswrapper[5008]: I1126 22:39:57.998921 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r4xtd" event={"ID":"8509b0e0-c914-44a1-a657-ffb4f5a86c18","Type":"ContainerStarted","Data":"7771b2db4cb155a3eaf8d9c456340e0c2fdae6e830412dfb95a69482a8fd39fa"} Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.012162 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8
b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resou
rces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:58Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.025897 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:58Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.029778 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.029853 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.029868 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.029890 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.029907 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:58Z","lastTransitionTime":"2025-11-26T22:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.041389 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"nam
e\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whe
reabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:58Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.053827 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea
83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:58Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.071280 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:40Z\\\",\\\"message\\\":\\\"empting retry of *v1.Pod openshift-multus/network-metrics-daemon-xplkg before timer (time: 2025-11-26 22:39:41.876747616 +0000 UTC m=+1.899387262): skip\\\\nI1126 22:39:40.604512 6689 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1126 22:39:40.604531 6689 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1126 22:39:40.604547 6689 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI1126 22:39:40.604574 6689 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 145.815µs)\\\\nI1126 22:39:40.604712 6689 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1126 22:39:40.604782 6689 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 22:39:40.604820 6689 factory.go:656] Stopping watch factory\\\\nI1126 22:39:40.604827 6689 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 22:39:40.604838 6689 ovnkube.go:599] Stopped ovnkube\\\\nI1126 22:39:40.604859 6689 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 22:39:40.604886 6689 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 22:39:40.604961 6689 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zpbmz_openshift-ovn-kubernetes(41e5d1a8-86e0-42e2-a446-8f8938091dc1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63
968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:58Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.085235 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7771b2db4cb155a3eaf8d9c456340e0c2fdae6e830412dfb95a69482a8fd39fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:56Z\\\",\\\"message\\\":\\\"2025-11-26T22:39:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c2e69fa-5c39-46a4-b451-b7f299229962\\\\n2025-11-26T22:39:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c2e69fa-5c39-46a4-b451-b7f299229962 to /host/opt/cni/bin/\\\\n2025-11-26T22:39:11Z [verbose] multus-daemon started\\\\n2025-11-26T22:39:11Z [verbose] 
Readiness Indicator file check\\\\n2025-11-26T22:39:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:58Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.095636 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xplkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7adf9a69-5de6-4710-b394-968387df9ae6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xplkg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:58Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.108373 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:58Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.123522 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:58Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.132307 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.132338 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.132347 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:58 crc 
kubenswrapper[5008]: I1126 22:39:58.132360 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.132369 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:58Z","lastTransitionTime":"2025-11-26T22:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.137386 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:58Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.148228 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:58Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.165161 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:58Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.174388 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"791021bb-9ed3-4c4f-8081-2f84615bad85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709d4c7c869a6123fb3da1ed7bdc57c002e11b9eabcd3a44112be80f50f101a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94ef8187fdbc6022fe2f451e4d267e0311448a6820236175adce557f66f2bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://071f58b25e93d75bba1ef6b9123717c1e8887e869077efebf70fc55f91d8fe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:58Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.187288 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:58Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.197877 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc9433194eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:58Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.210171 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fe79b6-3204-4cbc-b84f-2f700281ab05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94e2f28b2aa2ba5a228b7d53f35c8176619953fe656fecb08d539d6cd48d5207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e0874f3905e78658040b32877f9ac9910a494c9251878cdd8b31c83d89aa3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vwnrv\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:58Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.222630 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75e29703-31bc-4758-94f6-81214c6d4943\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8a32772c75918a8f47660b624d60692798964409588b64b0ba25f26d3b061a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d019140fe5457229b7831f556c5de41b4c2dd9f1e070670dcd29469f573fab7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d019140fe5457229b7831f556c5de41b4c2dd9f1e070670dcd29469f573fab7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:58Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.234896 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.234946 5008 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.234988 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.235012 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.235030 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:58Z","lastTransitionTime":"2025-11-26T22:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.235646 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T22:39:58Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.245836 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:39:58Z is after 2025-08-24T17:21:41Z" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.337536 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.337577 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.337588 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.337604 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.337615 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:58Z","lastTransitionTime":"2025-11-26T22:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.440309 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.440348 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.440389 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.440407 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.440468 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:58Z","lastTransitionTime":"2025-11-26T22:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.517378 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.517433 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.517432 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:39:58 crc kubenswrapper[5008]: E1126 22:39:58.517550 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:39:58 crc kubenswrapper[5008]: E1126 22:39:58.517634 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:39:58 crc kubenswrapper[5008]: E1126 22:39:58.517827 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.542862 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.542901 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.542912 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.542929 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.542941 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:58Z","lastTransitionTime":"2025-11-26T22:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.644705 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.644772 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.644800 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.644830 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.644852 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:58Z","lastTransitionTime":"2025-11-26T22:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.747502 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.747550 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.747562 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.747583 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.747598 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:58Z","lastTransitionTime":"2025-11-26T22:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.850277 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.850333 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.850349 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.850378 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.850416 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:58Z","lastTransitionTime":"2025-11-26T22:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.953457 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.953532 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.953556 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.953585 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:58 crc kubenswrapper[5008]: I1126 22:39:58.953605 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:58Z","lastTransitionTime":"2025-11-26T22:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.056707 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.056752 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.056762 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.056777 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.056786 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:59Z","lastTransitionTime":"2025-11-26T22:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.161781 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.161851 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.161872 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.161899 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.161926 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:59Z","lastTransitionTime":"2025-11-26T22:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.265239 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.265313 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.265337 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.265368 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.265388 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:59Z","lastTransitionTime":"2025-11-26T22:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.368454 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.368546 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.368575 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.368604 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.368625 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:59Z","lastTransitionTime":"2025-11-26T22:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.471899 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.471959 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.472010 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.472037 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.472058 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:59Z","lastTransitionTime":"2025-11-26T22:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.518392 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:39:59 crc kubenswrapper[5008]: E1126 22:39:59.518636 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.574801 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.574865 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.574882 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.574913 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.574933 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:59Z","lastTransitionTime":"2025-11-26T22:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.677638 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.677685 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.677698 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.677713 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.677724 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:59Z","lastTransitionTime":"2025-11-26T22:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.780743 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.780789 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.780802 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.780819 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.780832 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:59Z","lastTransitionTime":"2025-11-26T22:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.884161 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.884211 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.884228 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.884250 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.884266 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:59Z","lastTransitionTime":"2025-11-26T22:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.988139 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.988695 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.988758 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.988852 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:39:59 crc kubenswrapper[5008]: I1126 22:39:59.988933 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:39:59Z","lastTransitionTime":"2025-11-26T22:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.091312 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.091366 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.091384 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.091407 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.091423 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:00Z","lastTransitionTime":"2025-11-26T22:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.194249 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.194355 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.194375 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.194408 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.194429 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:00Z","lastTransitionTime":"2025-11-26T22:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.298183 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.298260 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.298286 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.298316 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.298341 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:00Z","lastTransitionTime":"2025-11-26T22:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.401084 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.401119 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.401129 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.401146 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.401157 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:00Z","lastTransitionTime":"2025-11-26T22:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.504080 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.504141 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.504159 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.504182 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.504198 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:00Z","lastTransitionTime":"2025-11-26T22:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.517349 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.517376 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:00 crc kubenswrapper[5008]: E1126 22:40:00.517491 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.517515 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:00 crc kubenswrapper[5008]: E1126 22:40:00.517648 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:00 crc kubenswrapper[5008]: E1126 22:40:00.517768 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.606805 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.606881 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.606906 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.606943 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.607001 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:00Z","lastTransitionTime":"2025-11-26T22:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.709782 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.709842 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.709860 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.709901 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.709919 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:00Z","lastTransitionTime":"2025-11-26T22:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.812346 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.812402 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.812421 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.812443 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.812459 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:00Z","lastTransitionTime":"2025-11-26T22:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.914667 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.914737 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.914765 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.914807 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:00 crc kubenswrapper[5008]: I1126 22:40:00.914830 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:00Z","lastTransitionTime":"2025-11-26T22:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.017465 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.017504 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.017515 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.017529 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.017542 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:01Z","lastTransitionTime":"2025-11-26T22:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.120437 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.120506 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.120541 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.120569 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.120589 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:01Z","lastTransitionTime":"2025-11-26T22:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.224121 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.224201 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.224220 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.224249 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.224268 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:01Z","lastTransitionTime":"2025-11-26T22:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.327427 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.327501 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.327523 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.327577 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.327600 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:01Z","lastTransitionTime":"2025-11-26T22:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.431388 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.431483 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.431514 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.431549 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.431572 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:01Z","lastTransitionTime":"2025-11-26T22:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.517863 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:01 crc kubenswrapper[5008]: E1126 22:40:01.518187 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.533810 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.533880 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.533898 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.533926 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.533944 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:01Z","lastTransitionTime":"2025-11-26T22:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.637425 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.637495 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.637512 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.637537 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.637554 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:01Z","lastTransitionTime":"2025-11-26T22:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.740040 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.740095 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.740112 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.740136 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.740154 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:01Z","lastTransitionTime":"2025-11-26T22:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.842478 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.842512 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.842525 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.842540 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.842550 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:01Z","lastTransitionTime":"2025-11-26T22:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.946093 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.946163 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.946184 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.946213 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:01 crc kubenswrapper[5008]: I1126 22:40:01.946235 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:01Z","lastTransitionTime":"2025-11-26T22:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.048762 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.048825 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.048847 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.048875 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.048896 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:02Z","lastTransitionTime":"2025-11-26T22:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.152047 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.152113 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.152130 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.152155 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.152172 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:02Z","lastTransitionTime":"2025-11-26T22:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.254818 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.254883 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.254899 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.254915 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.254926 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:02Z","lastTransitionTime":"2025-11-26T22:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.358109 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.358211 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.358242 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.358278 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.358303 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:02Z","lastTransitionTime":"2025-11-26T22:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.461121 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.461160 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.461171 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.461188 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.461199 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:02Z","lastTransitionTime":"2025-11-26T22:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.518047 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.518196 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:02 crc kubenswrapper[5008]: E1126 22:40:02.518270 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.518312 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:02 crc kubenswrapper[5008]: E1126 22:40:02.518515 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:02 crc kubenswrapper[5008]: E1126 22:40:02.518571 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.564018 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.564079 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.564100 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.564128 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.564150 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:02Z","lastTransitionTime":"2025-11-26T22:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.667023 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.667412 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.667651 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.667838 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.668073 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:02Z","lastTransitionTime":"2025-11-26T22:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.771359 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.771732 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.771893 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.772128 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.772274 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:02Z","lastTransitionTime":"2025-11-26T22:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.875336 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.875403 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.875421 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.875447 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:02 crc kubenswrapper[5008]: I1126 22:40:02.875466 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:02Z","lastTransitionTime":"2025-11-26T22:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:03 crc kubenswrapper[5008]: I1126 22:40:03.497037 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:03 crc kubenswrapper[5008]: I1126 22:40:03.497085 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:03 crc kubenswrapper[5008]: I1126 22:40:03.497101 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:03 crc kubenswrapper[5008]: I1126 22:40:03.497123 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:03 crc kubenswrapper[5008]: I1126 22:40:03.497141 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:03Z","lastTransitionTime":"2025-11-26T22:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:03 crc kubenswrapper[5008]: I1126 22:40:03.518111 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:03 crc kubenswrapper[5008]: E1126 22:40:03.518306 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:03 crc kubenswrapper[5008]: I1126 22:40:03.600914 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:03 crc kubenswrapper[5008]: I1126 22:40:03.601012 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:03 crc kubenswrapper[5008]: I1126 22:40:03.601037 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:03 crc kubenswrapper[5008]: I1126 22:40:03.601067 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:03 crc kubenswrapper[5008]: I1126 22:40:03.601089 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:03Z","lastTransitionTime":"2025-11-26T22:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:04 crc kubenswrapper[5008]: E1126 22:40:04.175660 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:04Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.187759 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.187818 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.187838 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.187861 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.187878 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:04Z","lastTransitionTime":"2025-11-26T22:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:04 crc kubenswrapper[5008]: E1126 22:40:04.209205 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:04Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.215441 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.215496 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.215509 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.215529 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.215542 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:04Z","lastTransitionTime":"2025-11-26T22:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:04 crc kubenswrapper[5008]: E1126 22:40:04.231304 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:04Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.235165 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.235220 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.235237 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.235266 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.235284 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:04Z","lastTransitionTime":"2025-11-26T22:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:04 crc kubenswrapper[5008]: E1126 22:40:04.250404 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:04Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.254401 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.254597 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.254739 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.254869 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.255007 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:04Z","lastTransitionTime":"2025-11-26T22:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:04 crc kubenswrapper[5008]: E1126 22:40:04.271568 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:04Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:04 crc kubenswrapper[5008]: E1126 22:40:04.271679 5008 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.272994 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.273040 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.273053 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.273071 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.273082 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:04Z","lastTransitionTime":"2025-11-26T22:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.376568 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.376617 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.376633 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.376656 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.376673 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:04Z","lastTransitionTime":"2025-11-26T22:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.480331 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.480395 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.480412 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.480435 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.480452 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:04Z","lastTransitionTime":"2025-11-26T22:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.518272 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.518358 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.518446 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:04 crc kubenswrapper[5008]: E1126 22:40:04.518587 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:04 crc kubenswrapper[5008]: E1126 22:40:04.518779 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:04 crc kubenswrapper[5008]: E1126 22:40:04.518857 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.519490 5008 scope.go:117] "RemoveContainer" containerID="03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.582691 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.583062 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.583076 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.583094 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.583108 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:04Z","lastTransitionTime":"2025-11-26T22:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.685658 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.685709 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.685728 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.685754 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.685772 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:04Z","lastTransitionTime":"2025-11-26T22:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.789372 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.789483 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.789508 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.789544 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.789566 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:04Z","lastTransitionTime":"2025-11-26T22:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.892572 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.892636 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.892652 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.892676 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.892694 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:04Z","lastTransitionTime":"2025-11-26T22:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.995285 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.995324 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.995335 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.995352 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:04 crc kubenswrapper[5008]: I1126 22:40:04.995364 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:04Z","lastTransitionTime":"2025-11-26T22:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.022338 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovnkube-controller/2.log" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.025134 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerStarted","Data":"902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45"} Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.025561 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.039073 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583
bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.060862 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:40Z\\\",\\\"message\\\":\\\"empting retry of *v1.Pod openshift-multus/network-metrics-daemon-xplkg before timer (time: 2025-11-26 22:39:41.876747616 +0000 UTC m=+1.899387262): skip\\\\nI1126 22:39:40.604512 6689 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1126 22:39:40.604531 6689 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1126 22:39:40.604547 6689 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI1126 22:39:40.604574 6689 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 145.815µs)\\\\nI1126 22:39:40.604712 6689 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1126 22:39:40.604782 6689 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 22:39:40.604820 6689 factory.go:656] Stopping watch factory\\\\nI1126 22:39:40.604827 6689 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 22:39:40.604838 6689 ovnkube.go:599] Stopped ovnkube\\\\nI1126 22:39:40.604859 6689 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 22:39:40.604886 6689 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 22:39:40.604961 6689 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:40:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.075366 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7771b2db4cb155a3eaf8d9c456340e0c2fdae6e830412dfb95a69482a8fd39fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:56Z\\\",\\\"message\\\":\\\"2025-11-26T22:39:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c2e69fa-5c39-46a4-b451-b7f299229962\\\\n2025-11-26T22:39:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c2e69fa-5c39-46a4-b451-b7f299229962 to /host/opt/cni/bin/\\\\n2025-11-26T22:39:11Z [verbose] multus-daemon started\\\\n2025-11-26T22:39:11Z [verbose] 
Readiness Indicator file check\\\\n2025-11-26T22:39:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.089489 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xplkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7adf9a69-5de6-4710-b394-968387df9ae6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xplkg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.098166 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.098212 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.098225 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.098246 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.098263 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:05Z","lastTransitionTime":"2025-11-26T22:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.106116 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.122403 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.142784 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.160851 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.194980 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.200344 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.200379 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.200391 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.200407 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.200418 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:05Z","lastTransitionTime":"2025-11-26T22:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.214204 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"791021bb-9ed3-4c4f-8081-2f84615bad85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709d4c7c869a6123fb3da1ed7bdc57c002e11b9eabcd3a44112be80f50f101a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94ef8187fdbc6022fe2f451e4d267
e0311448a6820236175adce557f66f2bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://071f58b25e93d75bba1ef6b9123717c1e8887e869077efebf70fc55f91d8fe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.227431 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.237795 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc9433194eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.249511 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fe79b6-3204-4cbc-b84f-2f700281ab05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94e2f28b2aa2ba5a228b7d53f35c8176619953fe656fecb08d539d6cd48d5207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e0874f3905e78658040b32877f9ac9910a494c9251878cdd8b31c83d89aa3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vwnrv\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.257656 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75e29703-31bc-4758-94f6-81214c6d4943\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8a32772c75918a8f47660b624d60692798964409588b64b0ba25f26d3b061a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d019140fe5457229b7831f556c5de41b4c2dd9f1e070670dcd29469f573fab7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d019140fe5457229b7831f556c5de41b4c2dd9f1e070670dcd29469f573fab7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.266456 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.276210 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.292459 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 
maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.304272 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.306134 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.306201 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.306216 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.306235 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.306247 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:05Z","lastTransitionTime":"2025-11-26T22:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.321543 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.409558 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.409610 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.409628 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.409652 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.409670 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:05Z","lastTransitionTime":"2025-11-26T22:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.511810 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.511844 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.511858 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.511878 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.511892 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:05Z","lastTransitionTime":"2025-11-26T22:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.518386 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:05 crc kubenswrapper[5008]: E1126 22:40:05.518521 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.541777 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 
22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.556298 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.576325 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7
f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.610533 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:40Z\\\",\\\"message\\\":\\\"empting retry of *v1.Pod openshift-multus/network-metrics-daemon-xplkg before timer (time: 2025-11-26 22:39:41.876747616 +0000 UTC m=+1.899387262): skip\\\\nI1126 22:39:40.604512 6689 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1126 22:39:40.604531 6689 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1126 22:39:40.604547 6689 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI1126 22:39:40.604574 6689 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 145.815µs)\\\\nI1126 22:39:40.604712 6689 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1126 22:39:40.604782 6689 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 22:39:40.604820 6689 factory.go:656] Stopping watch factory\\\\nI1126 22:39:40.604827 6689 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 22:39:40.604838 6689 ovnkube.go:599] Stopped ovnkube\\\\nI1126 22:39:40.604859 6689 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 22:39:40.604886 6689 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 22:39:40.604961 6689 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:40:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.614723 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.614814 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.614836 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.614929 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.614953 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:05Z","lastTransitionTime":"2025-11-26T22:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.632696 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7771b2db4cb155a3eaf8d9c456340e0c2fdae6e830412dfb95a69482a8fd39fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5692
1dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:56Z\\\",\\\"message\\\":\\\"2025-11-26T22:39:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c2e69fa-5c39-46a4-b451-b7f299229962\\\\n2025-11-26T22:39:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c2e69fa-5c39-46a4-b451-b7f299229962 to /host/opt/cni/bin/\\\\n2025-11-26T22:39:11Z [verbose] multus-daemon started\\\\n2025-11-26T22:39:11Z [verbose] Readiness Indicator file check\\\\n2025-11-26T22:39:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/k
ubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.644477 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xplkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7adf9a69-5de6-4710-b394-968387df9ae6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xplkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc 
kubenswrapper[5008]: I1126 22:40:05.655601 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.665262 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.675572 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.684764 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.700709 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.710911 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"791021bb-9ed3-4c4f-8081-2f84615bad85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709d4c7c869a6123fb3da1ed7bdc57c002e11b9eabcd3a44112be80f50f101a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94ef8187fdbc6022fe2f451e4d267e0311448a6820236175adce557f66f2bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://071f58b25e93d75bba1ef6b9123717c1e8887e869077efebf70fc55f91d8fe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.717480 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.717514 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 
22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.717524 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.717538 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.717547 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:05Z","lastTransitionTime":"2025-11-26T22:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.721208 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.732923 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.743534 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fe79b6-3204-4cbc-b84f-2f700281ab05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94e2f28b2aa2ba5a228b7d53f35c8176619953fe656fecb08d539d6cd48d5207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e0874f3905e78658040b32877f9ac9910a49
4c9251878cdd8b31c83d89aa3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vwnrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.757118 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75e29703-31bc-4758-94f6-81214c6d4943\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8a32772c75918a8f47660b624d60692798964409588b64b0ba25f26d3b061a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d019140fe5457229b7831f556c5de41b4c2dd9f1e070670dcd29469f573fab7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d019140fe5457229b7831f556c5de41b4c2dd9f1e070670dcd29469f573fab7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.769523 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.779836 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.789253 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc94331
94eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:05Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.820267 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.820306 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.820317 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.820334 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.820346 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:05Z","lastTransitionTime":"2025-11-26T22:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.923138 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.923185 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.923194 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.923207 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:05 crc kubenswrapper[5008]: I1126 22:40:05.923216 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:05Z","lastTransitionTime":"2025-11-26T22:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.025660 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.025716 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.025739 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.025766 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.025787 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:06Z","lastTransitionTime":"2025-11-26T22:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.031072 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovnkube-controller/3.log" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.031720 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovnkube-controller/2.log" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.035582 5008 generic.go:334] "Generic (PLEG): container finished" podID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerID="902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45" exitCode=1 Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.035626 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerDied","Data":"902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45"} Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.035661 5008 scope.go:117] "RemoveContainer" containerID="03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.036762 5008 scope.go:117] "RemoveContainer" containerID="902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45" Nov 26 22:40:06 crc kubenswrapper[5008]: E1126 22:40:06.037127 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zpbmz_openshift-ovn-kubernetes(41e5d1a8-86e0-42e2-a446-8f8938091dc1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.060557 5008 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd
2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba5
7b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:06Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.072490 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"791021bb-9ed3-4c4f-8081-2f84615bad85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709d4c7c869a6123fb3da1ed7bdc57c002e11b9eabcd3a44112be80f50f101a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347202
43b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94ef8187fdbc6022fe2f451e4d267e0311448a6820236175adce557f66f2bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://071f58b25e93d75bba1ef6b9123717c1e8887e869077efebf70fc55f91d8fe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-1
1-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:06Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.088310 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:06Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.101411 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:06Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.112550 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75e29703-31bc-4758-94f6-81214c6d4943\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8a32772c75918a8f47660b624d60692798964409588b64b0ba25f26d3b061a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d019140fe5457229b7831f556c5de41b4c2dd9f1e070670dcd29469f573fab7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d019140fe5457229b7831f556c5de41b4c2dd9f1e070670dcd29469f573fab7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:06Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.122469 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T22:40:06Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.128454 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.128483 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.128491 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.128504 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.128512 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:06Z","lastTransitionTime":"2025-11-26T22:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.132681 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:06Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.141670 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc9433194eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:06Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.152194 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fe79b6-3204-4cbc-b84f-2f700281ab05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94e2f28b2aa2ba5a228b7d53f35c8176619953fe656fecb08d539d6cd48d5207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e0874f3905e78658040b32877f9ac9910a49
4c9251878cdd8b31c83d89aa3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vwnrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:06Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.165802 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\"
,\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1
e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:06Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.177638 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:06Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.192946 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7
f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:06Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.207080 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xplkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7adf9a69-5de6-4710-b394-968387df9ae6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xplkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:06Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:06 crc 
kubenswrapper[5008]: I1126 22:40:06.222620 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:06Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.230281 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.230476 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.230563 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.230683 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.230768 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:06Z","lastTransitionTime":"2025-11-26T22:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.235660 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:06Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.248063 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:06Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.258224 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:06Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.277930 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03fb470c1c196d24ee45b97ad4b657550ec43ccd648eed37d3b5fe4d557193bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:40Z\\\",\\\"message\\\":\\\"empting retry of *v1.Pod openshift-multus/network-metrics-daemon-xplkg before timer (time: 2025-11-26 22:39:41.876747616 +0000 UTC m=+1.899387262): skip\\\\nI1126 22:39:40.604512 6689 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1126 22:39:40.604531 6689 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1126 22:39:40.604547 6689 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI1126 22:39:40.604574 6689 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 145.815µs)\\\\nI1126 22:39:40.604712 6689 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1126 22:39:40.604782 6689 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 22:39:40.604820 6689 factory.go:656] Stopping watch factory\\\\nI1126 22:39:40.604827 6689 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 22:39:40.604838 6689 ovnkube.go:599] Stopped ovnkube\\\\nI1126 22:39:40.604859 6689 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 22:39:40.604886 6689 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 22:39:40.604961 6689 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:40:05Z\\\",\\\"message\\\":\\\"} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1126 22:40:05.418381 7032 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} 
port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT 
Row:map[external_ip:192\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:40:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef010981724
12f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:06Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.291471 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7771b2db4cb155a3eaf8d9c456340e0c2fdae6e830412dfb95a69482a8fd39fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:56Z\\\",\\\"message\\\":\\\"2025-11-26T22:39:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c2e69fa-5c39-46a4-b451-b7f299229962\\\\n2025-11-26T22:39:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c2e69fa-5c39-46a4-b451-b7f299229962 to /host/opt/cni/bin/\\\\n2025-11-26T22:39:11Z [verbose] multus-daemon started\\\\n2025-11-26T22:39:11Z [verbose] 
Readiness Indicator file check\\\\n2025-11-26T22:39:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:06Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.333635 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.333680 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.333696 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.333716 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.333731 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:06Z","lastTransitionTime":"2025-11-26T22:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.436574 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.436670 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.436698 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.436731 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.436756 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:06Z","lastTransitionTime":"2025-11-26T22:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.517391 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:06 crc kubenswrapper[5008]: E1126 22:40:06.517516 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.517395 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:06 crc kubenswrapper[5008]: E1126 22:40:06.517577 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.517596 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:06 crc kubenswrapper[5008]: E1126 22:40:06.517749 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.538863 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.538897 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.538909 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.538923 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.538935 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:06Z","lastTransitionTime":"2025-11-26T22:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.641692 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.641756 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.641774 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.641799 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.641818 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:06Z","lastTransitionTime":"2025-11-26T22:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.745326 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.745385 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.745401 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.745428 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.745449 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:06Z","lastTransitionTime":"2025-11-26T22:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.848805 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.848884 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.848905 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.848931 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.848951 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:06Z","lastTransitionTime":"2025-11-26T22:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.953327 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.953399 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.953419 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.953445 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:06 crc kubenswrapper[5008]: I1126 22:40:06.953472 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:06Z","lastTransitionTime":"2025-11-26T22:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.041505 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovnkube-controller/3.log" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.046451 5008 scope.go:117] "RemoveContainer" containerID="902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45" Nov 26 22:40:07 crc kubenswrapper[5008]: E1126 22:40:07.046708 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zpbmz_openshift-ovn-kubernetes(41e5d1a8-86e0-42e2-a446-8f8938091dc1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.056120 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.056178 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.056196 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.056220 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.056237 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:07Z","lastTransitionTime":"2025-11-26T22:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.063181 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbd164b939d3a5cfad154eb83e9eb80242a276748f10e39af2f7437b910120f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" 
for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:07Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.079458 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8x546" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d86641-6721-4f54-bc04-f188d8d13079\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f9abe45c203d22eb56dfe6c8eb0877163ecddf1a06d25f0e03d6ad34f067957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6s6mv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8x546\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:07Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.095754 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cq57l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39db8495-eee6-4117-b69a-aca4f98eb640\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681b5699a3d1a0352c7763470e7ef9359d7e6ab0aee5c79fc9433194eaa24024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t288w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cq57l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:07Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.113001 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fe79b6-3204-4cbc-b84f-2f700281ab05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94e2f28b2aa2ba5a228b7d53f35c8176619953fe656fecb08d539d6cd48d5207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e0874f3905e78658040b32877f9ac9910a494c9251878cdd8b31c83d89aa3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pnnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vwnrv\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:07Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.128255 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75e29703-31bc-4758-94f6-81214c6d4943\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8a32772c75918a8f47660b624d60692798964409588b64b0ba25f26d3b061a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d019140fe5457229b7831f556c5de41b4c2dd9f1e070670dcd29469f573fab7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d019140fe5457229b7831f556c5de41b4c2dd9f1e070670dcd29469f573fab7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:07Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.148260 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c48aef6-de3b-48f7-b2a3-6a63f7d5ca75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c0dad670cd7084c2f38cbf6a25cab9c3354ad19050ea816c8cf6954c535222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://862115a136b35d0a16e0007a4823fde82f134e94809b71dd9a316190becb5ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e14322be000e56427c5bee287583f8b0cda97bc54f603f5f858cb9ffc326f531\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:07Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.158324 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.158361 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.158372 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.158388 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.158398 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:07Z","lastTransitionTime":"2025-11-26T22:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.162091 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff27df38-53d5-442d-a931-90a20311879c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec283c269d13922a9bbd46b9249e143495e809b6c7ea94236c5a78676f0f119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28e37fadc084002f8e233cffe7d801d942255476576837f8207f4f00c0c7f78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7366401ae543625c693e546506431485153a776c010deb9a468b1181a5362e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8300142a4300cdc0fc42efc66983e28a7129af556fd2051683e5b256a268c62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"nam
e\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e1c7f1870082ef18fff74432eac7a0d30e47da9203c55473868323afe92fd5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whe
reabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a489068121f9a700d13cbbe9e6fab5f190baf961c90516d00752c72d4991e0a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e7076bd9abcb51eb584cde1306d6ca78ce72d5389048ba39b606145eec3883\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prhq6\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ftgz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:07Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.181341 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6251c6-3571-4928-a464-ab761a51d240\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06
bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 22:39:00.317683 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 22:39:00.320292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-29155358/tls.crt::/tmp/serving-cert-29155358/tls.key\\\\\\\"\\\\nI1126 22:39:06.042618 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 22:39:06.047293 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 22:39:06.047328 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 22:39:06.047367 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 22:39:06.047377 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 22:39:06.057813 1 secure_serving.go:57] Forcing use of http/1.1 
only\\\\nW1126 22:39:06.057847 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057855 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 22:39:06.057861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 22:39:06.057865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 22:39:06.057869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 22:39:06.057873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 22:39:06.057881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 22:39:06.060036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:07Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.198500 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:07Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.211996 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc7059e5eb5e45f4cd52dce0bfbe14afbe4b60a0c290e8b9a196254670b7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:07Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.228626 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e558d58-c5ad-41f5-930f-36ac26b1a1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a8e583bc84f9cfa2a55b0268d9f6c3353fb9e648139b4da1d42d4f57d5b86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkrlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4qkmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:07Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.257051 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e5d1a8-86e0-42e2-a446-8f8938091dc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:40:05Z\\\",\\\"message\\\":\\\"} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1126 22:40:05.418381 7032 transact.go:42] Configuring OVN: 
[{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:40:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zpbmz_openshift-ovn-kubernetes(41e5d1a8-86e0-42e2-a446-8f8938091dc1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b599665a1f6ec12f63
968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:39:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sf7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zpbmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:07Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.261233 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.261298 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.261323 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.261352 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.261378 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:07Z","lastTransitionTime":"2025-11-26T22:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.277825 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4xtd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8509b0e0-c914-44a1-a657-ffb4f5a86c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7771b2db4cb155a3eaf8d9c456340e0c2fdae6e830412dfb95a69482a8fd39fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5692
1dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T22:39:56Z\\\",\\\"message\\\":\\\"2025-11-26T22:39:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c2e69fa-5c39-46a4-b451-b7f299229962\\\\n2025-11-26T22:39:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c2e69fa-5c39-46a4-b451-b7f299229962 to /host/opt/cni/bin/\\\\n2025-11-26T22:39:11Z [verbose] multus-daemon started\\\\n2025-11-26T22:39:11Z [verbose] Readiness Indicator file check\\\\n2025-11-26T22:39:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T22:39:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/k
ubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jskx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4xtd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:07Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.294358 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xplkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7adf9a69-5de6-4710-b394-968387df9ae6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc88z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:39:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xplkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:07Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:07 crc 
kubenswrapper[5008]: I1126 22:40:07.313455 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:07Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.330956 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"791021bb-9ed3-4c4f-8081-2f84615bad85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709d4c7c869a6123fb3da1ed7bdc57c002e11b9eabcd3a44112be80f50f101a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94ef8187fdbc6022fe2f451e4d267e0311448a6820236175adce557f66f2bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://071f58b25e93d75bba1ef6b9123717c1e8887e869077efebf70fc55f91d8fe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://494d1ba04fcebe5d94aa29fe638b59cf227a8519e131d1180326200cebc2ee8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:07Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.346918 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:07Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.362479 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b2182b32073767fdb8d8e83dc4e3a2ec2eb73109ba65b5b3166b14ea1b9841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb4f49ba018bd8917103bcf1d739a6c9b26ab18e454713c5cddd470a4c8749e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:07Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.363835 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.363880 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.363899 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.363923 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.363940 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:07Z","lastTransitionTime":"2025-11-26T22:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.414630 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d5f18a7-1613-44d2-b5d8-4bbd996a79b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T22:38:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a79a811072c787d5281e9e96b9a4956369f13aa20f0f926c899446c78d47ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123d1392e452aaabd4765fa488d08a6c78cd9fe265720ec49a8bd2ce0c69ef66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b63aafbed8c8240655e77eaed107eb5be9bcbad4000490c7c30d70c240e49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829aa9bbdeb2c88f764d50490ba575155ec1fca49df70b887c4708b7e50753f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa23af11b8681749124f081ed653995015e7c1c4d7e87d5824632eabef84502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T22:38:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6212e3c208a613e278a448ccd4ba57b2f994cd16e7c0a15fe6dc6b799eac313d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0913c99d769b8c683e4680bc9664fd1b5f76a5407d727135d9a1a4c9d84bfb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc8f8cf38f15b0b82719041100e604aab99abb6b7005720dffa67d2da0995b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T22:38:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T22:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T22:38:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:07Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.466269 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.466306 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.466319 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.466335 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.466347 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:07Z","lastTransitionTime":"2025-11-26T22:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.518479 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:07 crc kubenswrapper[5008]: E1126 22:40:07.518653 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.569113 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.569176 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.569188 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.569203 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.569217 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:07Z","lastTransitionTime":"2025-11-26T22:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.671751 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.672123 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.672282 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.672422 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.672540 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:07Z","lastTransitionTime":"2025-11-26T22:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.775260 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.775315 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.775333 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.775356 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.775372 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:07Z","lastTransitionTime":"2025-11-26T22:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.878205 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.878267 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.878285 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.878309 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.878328 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:07Z","lastTransitionTime":"2025-11-26T22:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.981548 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.981598 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.981613 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.981638 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:07 crc kubenswrapper[5008]: I1126 22:40:07.981655 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:07Z","lastTransitionTime":"2025-11-26T22:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.083907 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.083993 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.084020 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.084047 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.084064 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:08Z","lastTransitionTime":"2025-11-26T22:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.187189 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.187250 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.187269 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.187293 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.187310 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:08Z","lastTransitionTime":"2025-11-26T22:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.290291 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.290360 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.290383 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.290411 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.290433 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:08Z","lastTransitionTime":"2025-11-26T22:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.518074 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.518195 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:08 crc kubenswrapper[5008]: E1126 22:40:08.518400 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.518426 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:08 crc kubenswrapper[5008]: E1126 22:40:08.518532 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:08 crc kubenswrapper[5008]: E1126 22:40:08.518603 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.551780 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.551810 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.551828 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.551844 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.551854 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:08Z","lastTransitionTime":"2025-11-26T22:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.654210 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.654252 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.654266 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.654286 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.654299 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:08Z","lastTransitionTime":"2025-11-26T22:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.756095 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.756172 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.756181 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.756192 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.756201 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:08Z","lastTransitionTime":"2025-11-26T22:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.858085 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.858137 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.858150 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.858166 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.858177 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:08Z","lastTransitionTime":"2025-11-26T22:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.960415 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.960518 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.960544 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.960623 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:08 crc kubenswrapper[5008]: I1126 22:40:08.960646 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:08Z","lastTransitionTime":"2025-11-26T22:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.063694 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.063755 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.063777 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.063807 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.063828 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:09Z","lastTransitionTime":"2025-11-26T22:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.167160 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.167223 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.167240 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.167271 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.167291 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:09Z","lastTransitionTime":"2025-11-26T22:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.270435 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.270491 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.270506 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.270524 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.270536 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:09Z","lastTransitionTime":"2025-11-26T22:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.373596 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.373662 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.373679 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.373702 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.373718 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:09Z","lastTransitionTime":"2025-11-26T22:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.476326 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.476397 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.476422 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.476451 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.476471 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:09Z","lastTransitionTime":"2025-11-26T22:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.518495 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:09 crc kubenswrapper[5008]: E1126 22:40:09.518711 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.579813 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.579870 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.579885 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.579907 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.579922 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:09Z","lastTransitionTime":"2025-11-26T22:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.682854 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.682917 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.682938 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.683011 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.683036 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:09Z","lastTransitionTime":"2025-11-26T22:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.786320 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.786388 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.786411 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.786442 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.786463 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:09Z","lastTransitionTime":"2025-11-26T22:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.890387 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.890472 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.890495 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.890558 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.890585 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:09Z","lastTransitionTime":"2025-11-26T22:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.994026 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.994091 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.994116 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.994147 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:09 crc kubenswrapper[5008]: I1126 22:40:09.994170 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:09Z","lastTransitionTime":"2025-11-26T22:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.097500 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.097559 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.097582 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.097612 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.097634 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:10Z","lastTransitionTime":"2025-11-26T22:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.200870 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.200931 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.200949 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.201002 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.201022 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:10Z","lastTransitionTime":"2025-11-26T22:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.269191 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:40:10 crc kubenswrapper[5008]: E1126 22:40:10.269405 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 22:41:14.269377167 +0000 UTC m=+149.682071209 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.269483 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:10 crc kubenswrapper[5008]: E1126 22:40:10.269624 5008 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 22:40:10 crc kubenswrapper[5008]: E1126 22:40:10.269692 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 22:41:14.269678227 +0000 UTC m=+149.682372269 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.303882 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.303955 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.304045 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.304077 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.304100 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:10Z","lastTransitionTime":"2025-11-26T22:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.370937 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.371123 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:10 crc kubenswrapper[5008]: E1126 22:40:10.371176 5008 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 22:40:10 crc kubenswrapper[5008]: E1126 22:40:10.371274 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 22:41:14.371248287 +0000 UTC m=+149.783942319 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 22:40:10 crc kubenswrapper[5008]: E1126 22:40:10.371325 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 22:40:10 crc kubenswrapper[5008]: E1126 22:40:10.371361 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 22:40:10 crc kubenswrapper[5008]: E1126 22:40:10.371383 5008 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:40:10 crc kubenswrapper[5008]: E1126 22:40:10.371470 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 22:41:14.371446543 +0000 UTC m=+149.784140585 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.407239 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.407309 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.407327 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.407354 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.407374 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:10Z","lastTransitionTime":"2025-11-26T22:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.471615 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:10 crc kubenswrapper[5008]: E1126 22:40:10.472025 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 22:40:10 crc kubenswrapper[5008]: E1126 22:40:10.472108 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 22:40:10 crc kubenswrapper[5008]: E1126 22:40:10.472142 5008 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:40:10 crc kubenswrapper[5008]: E1126 22:40:10.472259 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 22:41:14.4722259 +0000 UTC m=+149.884919942 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.510003 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.510069 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.510087 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.510112 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.510128 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:10Z","lastTransitionTime":"2025-11-26T22:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.518302 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.518395 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.518323 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:10 crc kubenswrapper[5008]: E1126 22:40:10.518487 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:10 crc kubenswrapper[5008]: E1126 22:40:10.518646 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:10 crc kubenswrapper[5008]: E1126 22:40:10.518779 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.613411 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.613455 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.613467 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.613485 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.613498 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:10Z","lastTransitionTime":"2025-11-26T22:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.716266 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.716357 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.716379 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.716409 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.716446 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:10Z","lastTransitionTime":"2025-11-26T22:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.819188 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.819252 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.819270 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.819295 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.819313 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:10Z","lastTransitionTime":"2025-11-26T22:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.922100 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.922164 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.922187 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.922215 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:10 crc kubenswrapper[5008]: I1126 22:40:10.922237 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:10Z","lastTransitionTime":"2025-11-26T22:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.025270 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.025341 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.025358 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.025384 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.025401 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:11Z","lastTransitionTime":"2025-11-26T22:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.128858 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.128921 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.128944 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.129011 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.129038 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:11Z","lastTransitionTime":"2025-11-26T22:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.231474 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.231527 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.231542 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.231563 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.231578 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:11Z","lastTransitionTime":"2025-11-26T22:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.334678 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.334843 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.334863 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.334888 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.334906 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:11Z","lastTransitionTime":"2025-11-26T22:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.437817 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.437865 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.437874 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.437888 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.437897 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:11Z","lastTransitionTime":"2025-11-26T22:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.518136 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:11 crc kubenswrapper[5008]: E1126 22:40:11.518423 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.540332 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.540385 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.540402 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.540426 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.540445 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:11Z","lastTransitionTime":"2025-11-26T22:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.644044 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.644108 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.644125 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.644149 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.644171 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:11Z","lastTransitionTime":"2025-11-26T22:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.747424 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.747488 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.747507 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.747533 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.747554 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:11Z","lastTransitionTime":"2025-11-26T22:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.850568 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.850627 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.850643 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.850666 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.850685 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:11Z","lastTransitionTime":"2025-11-26T22:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.953120 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.953181 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.953197 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.953239 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:11 crc kubenswrapper[5008]: I1126 22:40:11.953256 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:11Z","lastTransitionTime":"2025-11-26T22:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.056822 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.056891 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.056910 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.056931 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.056949 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:12Z","lastTransitionTime":"2025-11-26T22:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.161172 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.161521 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.161538 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.161563 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.161580 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:12Z","lastTransitionTime":"2025-11-26T22:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.264554 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.264619 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.264637 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.264660 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.264677 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:12Z","lastTransitionTime":"2025-11-26T22:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.367791 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.367857 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.367880 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.367908 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.367930 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:12Z","lastTransitionTime":"2025-11-26T22:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.470759 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.470822 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.470832 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.470857 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.470869 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:12Z","lastTransitionTime":"2025-11-26T22:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.517671 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.517720 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.517930 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:12 crc kubenswrapper[5008]: E1126 22:40:12.518100 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:12 crc kubenswrapper[5008]: E1126 22:40:12.518227 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:12 crc kubenswrapper[5008]: E1126 22:40:12.518378 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.573759 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.573840 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.573862 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.573891 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.573911 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:12Z","lastTransitionTime":"2025-11-26T22:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.677483 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.677539 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.677556 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.677578 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.677597 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:12Z","lastTransitionTime":"2025-11-26T22:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.780578 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.780644 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.780664 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.780688 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.780705 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:12Z","lastTransitionTime":"2025-11-26T22:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.883563 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.883658 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.883687 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.883722 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.883746 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:12Z","lastTransitionTime":"2025-11-26T22:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.985925 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.985953 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.985961 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.985994 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:12 crc kubenswrapper[5008]: I1126 22:40:12.986002 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:12Z","lastTransitionTime":"2025-11-26T22:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.088762 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.088851 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.088881 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.088909 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.088929 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:13Z","lastTransitionTime":"2025-11-26T22:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.192617 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.192699 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.192728 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.192761 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.192783 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:13Z","lastTransitionTime":"2025-11-26T22:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.295807 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.295873 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.295889 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.295913 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.295932 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:13Z","lastTransitionTime":"2025-11-26T22:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.399459 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.399596 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.399621 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.399645 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.399662 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:13Z","lastTransitionTime":"2025-11-26T22:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.504259 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.504324 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.504341 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.504368 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.504387 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:13Z","lastTransitionTime":"2025-11-26T22:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.518077 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:13 crc kubenswrapper[5008]: E1126 22:40:13.518309 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.607996 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.608059 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.608078 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.608103 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.608122 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:13Z","lastTransitionTime":"2025-11-26T22:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.711748 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.711805 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.711823 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.711847 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.711863 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:13Z","lastTransitionTime":"2025-11-26T22:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.814315 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.814350 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.814361 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.814378 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.814391 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:13Z","lastTransitionTime":"2025-11-26T22:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.917153 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.917231 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.917253 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.917356 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:13 crc kubenswrapper[5008]: I1126 22:40:13.917418 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:13Z","lastTransitionTime":"2025-11-26T22:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.020652 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.020722 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.020733 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.020757 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.020770 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:14Z","lastTransitionTime":"2025-11-26T22:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.124221 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.124467 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.124585 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.124693 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.124784 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:14Z","lastTransitionTime":"2025-11-26T22:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.228493 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.228570 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.228584 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.228610 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.228625 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:14Z","lastTransitionTime":"2025-11-26T22:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.332124 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.332190 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.332208 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.332235 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.332253 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:14Z","lastTransitionTime":"2025-11-26T22:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.435644 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.435696 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.435712 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.435734 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.435751 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:14Z","lastTransitionTime":"2025-11-26T22:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.517868 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.517869 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.517990 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:14 crc kubenswrapper[5008]: E1126 22:40:14.518122 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:14 crc kubenswrapper[5008]: E1126 22:40:14.518269 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:14 crc kubenswrapper[5008]: E1126 22:40:14.518511 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.538456 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.538506 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.538517 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.538539 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.538558 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:14Z","lastTransitionTime":"2025-11-26T22:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.603865 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.603916 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.603928 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.603945 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.603985 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:14Z","lastTransitionTime":"2025-11-26T22:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 22:40:14 crc kubenswrapper[5008]: E1126 22:40:14.623570 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T22:40:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T22:40:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"286285dc-f4e0-4a95-97ca-f92c5aacc002\\\",\\\"systemUUID\\\":\\\"50254066-f1e7-4e4d-8ba4-3174542eac6b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T22:40:14Z is after 2025-08-24T17:21:41Z" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.628550 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.628602 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.628614 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.628629 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.628640 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T22:40:14Z","lastTransitionTime":"2025-11-26T22:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.690004 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf"] Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.690572 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.692865 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.693128 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.693582 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.695262 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.717745 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/98ea8d9c-96ac-4bb1-b346-86df7be628ef-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mdhrf\" (UID: \"98ea8d9c-96ac-4bb1-b346-86df7be628ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.717834 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/98ea8d9c-96ac-4bb1-b346-86df7be628ef-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mdhrf\" (UID: \"98ea8d9c-96ac-4bb1-b346-86df7be628ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.717888 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/98ea8d9c-96ac-4bb1-b346-86df7be628ef-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mdhrf\" (UID: \"98ea8d9c-96ac-4bb1-b346-86df7be628ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.717938 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98ea8d9c-96ac-4bb1-b346-86df7be628ef-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mdhrf\" (UID: \"98ea8d9c-96ac-4bb1-b346-86df7be628ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.717991 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/98ea8d9c-96ac-4bb1-b346-86df7be628ef-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mdhrf\" (UID: \"98ea8d9c-96ac-4bb1-b346-86df7be628ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.729858 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=68.729838304 podStartE2EDuration="1m8.729838304s" podCreationTimestamp="2025-11-26 22:39:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:40:14.728220975 +0000 UTC m=+90.140915037" watchObservedRunningTime="2025-11-26 22:40:14.729838304 +0000 UTC m=+90.142532346" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.746015 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=36.745996692 podStartE2EDuration="36.745996692s" podCreationTimestamp="2025-11-26 
22:39:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:40:14.745611101 +0000 UTC m=+90.158305133" watchObservedRunningTime="2025-11-26 22:40:14.745996692 +0000 UTC m=+90.158690704" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.810580 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=23.810553534 podStartE2EDuration="23.810553534s" podCreationTimestamp="2025-11-26 22:39:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:40:14.794416156 +0000 UTC m=+90.207110208" watchObservedRunningTime="2025-11-26 22:40:14.810553534 +0000 UTC m=+90.223247566" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.819210 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/98ea8d9c-96ac-4bb1-b346-86df7be628ef-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mdhrf\" (UID: \"98ea8d9c-96ac-4bb1-b346-86df7be628ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.819315 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98ea8d9c-96ac-4bb1-b346-86df7be628ef-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mdhrf\" (UID: \"98ea8d9c-96ac-4bb1-b346-86df7be628ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.819368 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/98ea8d9c-96ac-4bb1-b346-86df7be628ef-kube-api-access\") pod 
\"cluster-version-operator-5c965bbfc6-mdhrf\" (UID: \"98ea8d9c-96ac-4bb1-b346-86df7be628ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.819491 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/98ea8d9c-96ac-4bb1-b346-86df7be628ef-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mdhrf\" (UID: \"98ea8d9c-96ac-4bb1-b346-86df7be628ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.819546 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/98ea8d9c-96ac-4bb1-b346-86df7be628ef-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mdhrf\" (UID: \"98ea8d9c-96ac-4bb1-b346-86df7be628ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.819620 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/98ea8d9c-96ac-4bb1-b346-86df7be628ef-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mdhrf\" (UID: \"98ea8d9c-96ac-4bb1-b346-86df7be628ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.819669 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/98ea8d9c-96ac-4bb1-b346-86df7be628ef-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mdhrf\" (UID: \"98ea8d9c-96ac-4bb1-b346-86df7be628ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.820427 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/98ea8d9c-96ac-4bb1-b346-86df7be628ef-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mdhrf\" (UID: \"98ea8d9c-96ac-4bb1-b346-86df7be628ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.841554 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98ea8d9c-96ac-4bb1-b346-86df7be628ef-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mdhrf\" (UID: \"98ea8d9c-96ac-4bb1-b346-86df7be628ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.844605 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8x546" podStartSLOduration=67.844580912 podStartE2EDuration="1m7.844580912s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:40:14.826897857 +0000 UTC m=+90.239591859" watchObservedRunningTime="2025-11-26 22:40:14.844580912 +0000 UTC m=+90.257274924" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.844757 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cq57l" podStartSLOduration=67.844749567 podStartE2EDuration="1m7.844749567s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:40:14.842937732 +0000 UTC m=+90.255631744" watchObservedRunningTime="2025-11-26 22:40:14.844749567 +0000 UTC m=+90.257443599" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.852345 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/98ea8d9c-96ac-4bb1-b346-86df7be628ef-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mdhrf\" (UID: \"98ea8d9c-96ac-4bb1-b346-86df7be628ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.872816 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vwnrv" podStartSLOduration=67.872793265 podStartE2EDuration="1m7.872793265s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:40:14.859451482 +0000 UTC m=+90.272145534" watchObservedRunningTime="2025-11-26 22:40:14.872793265 +0000 UTC m=+90.285487277" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.893833 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.89380783 podStartE2EDuration="1m8.89380783s" podCreationTimestamp="2025-11-26 22:39:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:40:14.890257363 +0000 UTC m=+90.302951365" watchObservedRunningTime="2025-11-26 22:40:14.89380783 +0000 UTC m=+90.306501852" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.913128 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=68.913104503 podStartE2EDuration="1m8.913104503s" podCreationTimestamp="2025-11-26 22:39:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:40:14.91233599 +0000 UTC 
m=+90.325030002" watchObservedRunningTime="2025-11-26 22:40:14.913104503 +0000 UTC m=+90.325798525" Nov 26 22:40:14 crc kubenswrapper[5008]: I1126 22:40:14.944785 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ftgz4" podStartSLOduration=67.94476249 podStartE2EDuration="1m7.94476249s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:40:14.932334465 +0000 UTC m=+90.345028477" watchObservedRunningTime="2025-11-26 22:40:14.94476249 +0000 UTC m=+90.357456492" Nov 26 22:40:15 crc kubenswrapper[5008]: I1126 22:40:15.011558 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podStartSLOduration=68.011539678 podStartE2EDuration="1m8.011539678s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:40:14.984300205 +0000 UTC m=+90.396994217" watchObservedRunningTime="2025-11-26 22:40:15.011539678 +0000 UTC m=+90.424233690" Nov 26 22:40:15 crc kubenswrapper[5008]: I1126 22:40:15.013052 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf" Nov 26 22:40:15 crc kubenswrapper[5008]: I1126 22:40:15.037125 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-r4xtd" podStartSLOduration=68.037107121 podStartE2EDuration="1m8.037107121s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:40:15.036175093 +0000 UTC m=+90.448869105" watchObservedRunningTime="2025-11-26 22:40:15.037107121 +0000 UTC m=+90.449801133" Nov 26 22:40:15 crc kubenswrapper[5008]: I1126 22:40:15.075164 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf" event={"ID":"98ea8d9c-96ac-4bb1-b346-86df7be628ef","Type":"ContainerStarted","Data":"46d1c8fc5973ac08d445a2799c2be4395d022c1a8a9acbcfa69ce31161a39356"} Nov 26 22:40:15 crc kubenswrapper[5008]: I1126 22:40:15.517731 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:15 crc kubenswrapper[5008]: E1126 22:40:15.520073 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:16 crc kubenswrapper[5008]: I1126 22:40:16.080042 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf" event={"ID":"98ea8d9c-96ac-4bb1-b346-86df7be628ef","Type":"ContainerStarted","Data":"e8b7e182565edb8214348304b0e8cf02de59e9a5fc3f78d30c039469f349c558"} Nov 26 22:40:16 crc kubenswrapper[5008]: I1126 22:40:16.101054 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdhrf" podStartSLOduration=69.101031188 podStartE2EDuration="1m9.101031188s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:40:16.100013897 +0000 UTC m=+91.512707959" watchObservedRunningTime="2025-11-26 22:40:16.101031188 +0000 UTC m=+91.513725230" Nov 26 22:40:16 crc kubenswrapper[5008]: I1126 22:40:16.517736 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:16 crc kubenswrapper[5008]: I1126 22:40:16.517784 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:16 crc kubenswrapper[5008]: I1126 22:40:16.517737 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:16 crc kubenswrapper[5008]: E1126 22:40:16.517864 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:16 crc kubenswrapper[5008]: E1126 22:40:16.518010 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:16 crc kubenswrapper[5008]: E1126 22:40:16.518101 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:17 crc kubenswrapper[5008]: I1126 22:40:17.517617 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:17 crc kubenswrapper[5008]: E1126 22:40:17.518222 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:18 crc kubenswrapper[5008]: I1126 22:40:18.518000 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:18 crc kubenswrapper[5008]: I1126 22:40:18.518059 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:18 crc kubenswrapper[5008]: I1126 22:40:18.518078 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:18 crc kubenswrapper[5008]: E1126 22:40:18.518199 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:18 crc kubenswrapper[5008]: E1126 22:40:18.518307 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:18 crc kubenswrapper[5008]: E1126 22:40:18.518478 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:19 crc kubenswrapper[5008]: I1126 22:40:19.518539 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:19 crc kubenswrapper[5008]: E1126 22:40:19.518735 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:20 crc kubenswrapper[5008]: I1126 22:40:20.518281 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:20 crc kubenswrapper[5008]: I1126 22:40:20.518339 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:20 crc kubenswrapper[5008]: I1126 22:40:20.518317 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:20 crc kubenswrapper[5008]: E1126 22:40:20.518471 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:20 crc kubenswrapper[5008]: E1126 22:40:20.518588 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:20 crc kubenswrapper[5008]: E1126 22:40:20.518698 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:21 crc kubenswrapper[5008]: I1126 22:40:21.517811 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:21 crc kubenswrapper[5008]: E1126 22:40:21.518073 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:21 crc kubenswrapper[5008]: I1126 22:40:21.519070 5008 scope.go:117] "RemoveContainer" containerID="902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45" Nov 26 22:40:21 crc kubenswrapper[5008]: E1126 22:40:21.519308 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zpbmz_openshift-ovn-kubernetes(41e5d1a8-86e0-42e2-a446-8f8938091dc1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" Nov 26 22:40:22 crc kubenswrapper[5008]: I1126 22:40:22.518299 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:22 crc kubenswrapper[5008]: I1126 22:40:22.518386 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:22 crc kubenswrapper[5008]: E1126 22:40:22.518520 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:22 crc kubenswrapper[5008]: I1126 22:40:22.518561 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:22 crc kubenswrapper[5008]: E1126 22:40:22.518712 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:22 crc kubenswrapper[5008]: E1126 22:40:22.518881 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:23 crc kubenswrapper[5008]: I1126 22:40:23.518462 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:23 crc kubenswrapper[5008]: E1126 22:40:23.518717 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:24 crc kubenswrapper[5008]: I1126 22:40:24.518440 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:24 crc kubenswrapper[5008]: I1126 22:40:24.518470 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:24 crc kubenswrapper[5008]: E1126 22:40:24.518687 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:24 crc kubenswrapper[5008]: E1126 22:40:24.518754 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:24 crc kubenswrapper[5008]: I1126 22:40:24.518480 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:24 crc kubenswrapper[5008]: E1126 22:40:24.519153 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:25 crc kubenswrapper[5008]: I1126 22:40:25.517857 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:25 crc kubenswrapper[5008]: E1126 22:40:25.519615 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:25 crc kubenswrapper[5008]: I1126 22:40:25.848362 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs\") pod \"network-metrics-daemon-xplkg\" (UID: \"7adf9a69-5de6-4710-b394-968387df9ae6\") " pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:25 crc kubenswrapper[5008]: E1126 22:40:25.848595 5008 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 22:40:25 crc kubenswrapper[5008]: E1126 22:40:25.848730 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs podName:7adf9a69-5de6-4710-b394-968387df9ae6 nodeName:}" failed. No retries permitted until 2025-11-26 22:41:29.848698432 +0000 UTC m=+165.261392484 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs") pod "network-metrics-daemon-xplkg" (UID: "7adf9a69-5de6-4710-b394-968387df9ae6") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 22:40:26 crc kubenswrapper[5008]: I1126 22:40:26.517329 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:26 crc kubenswrapper[5008]: I1126 22:40:26.517454 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:26 crc kubenswrapper[5008]: I1126 22:40:26.517333 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:26 crc kubenswrapper[5008]: E1126 22:40:26.517511 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:26 crc kubenswrapper[5008]: E1126 22:40:26.517678 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:26 crc kubenswrapper[5008]: E1126 22:40:26.517936 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:27 crc kubenswrapper[5008]: I1126 22:40:27.518302 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:27 crc kubenswrapper[5008]: E1126 22:40:27.518509 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:28 crc kubenswrapper[5008]: I1126 22:40:28.517367 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:28 crc kubenswrapper[5008]: I1126 22:40:28.517395 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:28 crc kubenswrapper[5008]: I1126 22:40:28.517489 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:28 crc kubenswrapper[5008]: E1126 22:40:28.517656 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:28 crc kubenswrapper[5008]: E1126 22:40:28.517801 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:28 crc kubenswrapper[5008]: E1126 22:40:28.518011 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:29 crc kubenswrapper[5008]: I1126 22:40:29.517600 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:29 crc kubenswrapper[5008]: E1126 22:40:29.517824 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:30 crc kubenswrapper[5008]: I1126 22:40:30.518066 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:30 crc kubenswrapper[5008]: I1126 22:40:30.518075 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:30 crc kubenswrapper[5008]: I1126 22:40:30.518099 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:30 crc kubenswrapper[5008]: E1126 22:40:30.518566 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:30 crc kubenswrapper[5008]: E1126 22:40:30.518940 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:30 crc kubenswrapper[5008]: E1126 22:40:30.519051 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:31 crc kubenswrapper[5008]: I1126 22:40:31.518359 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:31 crc kubenswrapper[5008]: E1126 22:40:31.519316 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:32 crc kubenswrapper[5008]: I1126 22:40:32.518156 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:32 crc kubenswrapper[5008]: I1126 22:40:32.518477 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:32 crc kubenswrapper[5008]: I1126 22:40:32.518515 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:32 crc kubenswrapper[5008]: E1126 22:40:32.518713 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:32 crc kubenswrapper[5008]: E1126 22:40:32.519036 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:32 crc kubenswrapper[5008]: E1126 22:40:32.519742 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:32 crc kubenswrapper[5008]: I1126 22:40:32.520234 5008 scope.go:117] "RemoveContainer" containerID="902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45" Nov 26 22:40:32 crc kubenswrapper[5008]: E1126 22:40:32.520530 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zpbmz_openshift-ovn-kubernetes(41e5d1a8-86e0-42e2-a446-8f8938091dc1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" Nov 26 22:40:33 crc kubenswrapper[5008]: I1126 22:40:33.518014 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:33 crc kubenswrapper[5008]: E1126 22:40:33.518212 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:34 crc kubenswrapper[5008]: I1126 22:40:34.517825 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:34 crc kubenswrapper[5008]: I1126 22:40:34.517889 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:34 crc kubenswrapper[5008]: I1126 22:40:34.517832 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:34 crc kubenswrapper[5008]: E1126 22:40:34.517998 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:34 crc kubenswrapper[5008]: E1126 22:40:34.518111 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:34 crc kubenswrapper[5008]: E1126 22:40:34.518189 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:35 crc kubenswrapper[5008]: I1126 22:40:35.520287 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:35 crc kubenswrapper[5008]: E1126 22:40:35.520723 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:36 crc kubenswrapper[5008]: I1126 22:40:36.517881 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:36 crc kubenswrapper[5008]: I1126 22:40:36.517941 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:36 crc kubenswrapper[5008]: I1126 22:40:36.518069 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:36 crc kubenswrapper[5008]: E1126 22:40:36.518108 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:36 crc kubenswrapper[5008]: E1126 22:40:36.518257 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:36 crc kubenswrapper[5008]: E1126 22:40:36.518641 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:37 crc kubenswrapper[5008]: I1126 22:40:37.517746 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:37 crc kubenswrapper[5008]: E1126 22:40:37.519025 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:38 crc kubenswrapper[5008]: I1126 22:40:38.517712 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:38 crc kubenswrapper[5008]: E1126 22:40:38.517838 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:38 crc kubenswrapper[5008]: I1126 22:40:38.518520 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:38 crc kubenswrapper[5008]: I1126 22:40:38.518609 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:38 crc kubenswrapper[5008]: E1126 22:40:38.518871 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:38 crc kubenswrapper[5008]: E1126 22:40:38.518916 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:39 crc kubenswrapper[5008]: I1126 22:40:39.518253 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:39 crc kubenswrapper[5008]: E1126 22:40:39.518510 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:40 crc kubenswrapper[5008]: I1126 22:40:40.518089 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:40 crc kubenswrapper[5008]: I1126 22:40:40.518148 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:40 crc kubenswrapper[5008]: I1126 22:40:40.518187 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:40 crc kubenswrapper[5008]: E1126 22:40:40.518336 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:40 crc kubenswrapper[5008]: E1126 22:40:40.518450 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:40 crc kubenswrapper[5008]: E1126 22:40:40.518570 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:41 crc kubenswrapper[5008]: I1126 22:40:41.517679 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:41 crc kubenswrapper[5008]: E1126 22:40:41.517928 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:42 crc kubenswrapper[5008]: I1126 22:40:42.518246 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:42 crc kubenswrapper[5008]: I1126 22:40:42.518299 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:42 crc kubenswrapper[5008]: I1126 22:40:42.518367 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:42 crc kubenswrapper[5008]: E1126 22:40:42.518556 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:42 crc kubenswrapper[5008]: E1126 22:40:42.518605 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:42 crc kubenswrapper[5008]: E1126 22:40:42.518664 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:43 crc kubenswrapper[5008]: I1126 22:40:43.177096 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r4xtd_8509b0e0-c914-44a1-a657-ffb4f5a86c18/kube-multus/1.log" Nov 26 22:40:43 crc kubenswrapper[5008]: I1126 22:40:43.177846 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r4xtd_8509b0e0-c914-44a1-a657-ffb4f5a86c18/kube-multus/0.log" Nov 26 22:40:43 crc kubenswrapper[5008]: I1126 22:40:43.178014 5008 generic.go:334] "Generic (PLEG): container finished" podID="8509b0e0-c914-44a1-a657-ffb4f5a86c18" containerID="7771b2db4cb155a3eaf8d9c456340e0c2fdae6e830412dfb95a69482a8fd39fa" exitCode=1 Nov 26 22:40:43 crc kubenswrapper[5008]: I1126 22:40:43.178057 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r4xtd" event={"ID":"8509b0e0-c914-44a1-a657-ffb4f5a86c18","Type":"ContainerDied","Data":"7771b2db4cb155a3eaf8d9c456340e0c2fdae6e830412dfb95a69482a8fd39fa"} Nov 26 22:40:43 crc kubenswrapper[5008]: I1126 22:40:43.178290 5008 scope.go:117] "RemoveContainer" containerID="56921dbf83cc70dd8d29c210f3c6efade2b402db6da5ecc38b345fe8c64b7625" Nov 26 22:40:43 crc kubenswrapper[5008]: I1126 22:40:43.179458 5008 scope.go:117] "RemoveContainer" containerID="7771b2db4cb155a3eaf8d9c456340e0c2fdae6e830412dfb95a69482a8fd39fa" Nov 26 22:40:43 crc kubenswrapper[5008]: E1126 22:40:43.179791 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-r4xtd_openshift-multus(8509b0e0-c914-44a1-a657-ffb4f5a86c18)\"" pod="openshift-multus/multus-r4xtd" podUID="8509b0e0-c914-44a1-a657-ffb4f5a86c18" Nov 26 22:40:43 crc kubenswrapper[5008]: I1126 22:40:43.518300 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:43 crc kubenswrapper[5008]: E1126 22:40:43.519384 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:44 crc kubenswrapper[5008]: I1126 22:40:44.184291 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r4xtd_8509b0e0-c914-44a1-a657-ffb4f5a86c18/kube-multus/1.log" Nov 26 22:40:44 crc kubenswrapper[5008]: I1126 22:40:44.517545 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:44 crc kubenswrapper[5008]: I1126 22:40:44.517565 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:44 crc kubenswrapper[5008]: I1126 22:40:44.517577 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:44 crc kubenswrapper[5008]: E1126 22:40:44.517756 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:44 crc kubenswrapper[5008]: E1126 22:40:44.518289 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:44 crc kubenswrapper[5008]: E1126 22:40:44.518389 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:44 crc kubenswrapper[5008]: I1126 22:40:44.518872 5008 scope.go:117] "RemoveContainer" containerID="902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45" Nov 26 22:40:44 crc kubenswrapper[5008]: E1126 22:40:44.519172 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zpbmz_openshift-ovn-kubernetes(41e5d1a8-86e0-42e2-a446-8f8938091dc1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" Nov 26 22:40:45 crc kubenswrapper[5008]: I1126 22:40:45.519009 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:45 crc kubenswrapper[5008]: E1126 22:40:45.521210 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:45 crc kubenswrapper[5008]: E1126 22:40:45.530063 5008 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Nov 26 22:40:45 crc kubenswrapper[5008]: E1126 22:40:45.612276 5008 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 26 22:40:46 crc kubenswrapper[5008]: I1126 22:40:46.518244 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:46 crc kubenswrapper[5008]: E1126 22:40:46.518387 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:46 crc kubenswrapper[5008]: I1126 22:40:46.518237 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:46 crc kubenswrapper[5008]: I1126 22:40:46.518237 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:46 crc kubenswrapper[5008]: E1126 22:40:46.518460 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:46 crc kubenswrapper[5008]: E1126 22:40:46.518580 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:47 crc kubenswrapper[5008]: I1126 22:40:47.517630 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:47 crc kubenswrapper[5008]: E1126 22:40:47.517839 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:48 crc kubenswrapper[5008]: I1126 22:40:48.517982 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:48 crc kubenswrapper[5008]: I1126 22:40:48.518030 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:48 crc kubenswrapper[5008]: I1126 22:40:48.518058 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:48 crc kubenswrapper[5008]: E1126 22:40:48.518106 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:48 crc kubenswrapper[5008]: E1126 22:40:48.518212 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:48 crc kubenswrapper[5008]: E1126 22:40:48.518321 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:49 crc kubenswrapper[5008]: I1126 22:40:49.517935 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:49 crc kubenswrapper[5008]: E1126 22:40:49.518554 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:50 crc kubenswrapper[5008]: I1126 22:40:50.517795 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:50 crc kubenswrapper[5008]: I1126 22:40:50.517796 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:50 crc kubenswrapper[5008]: I1126 22:40:50.517938 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:50 crc kubenswrapper[5008]: E1126 22:40:50.518023 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:50 crc kubenswrapper[5008]: E1126 22:40:50.518160 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:50 crc kubenswrapper[5008]: E1126 22:40:50.518351 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:50 crc kubenswrapper[5008]: E1126 22:40:50.614144 5008 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 26 22:40:51 crc kubenswrapper[5008]: I1126 22:40:51.518334 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:51 crc kubenswrapper[5008]: E1126 22:40:51.518534 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:52 crc kubenswrapper[5008]: I1126 22:40:52.518428 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:52 crc kubenswrapper[5008]: I1126 22:40:52.518429 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:52 crc kubenswrapper[5008]: E1126 22:40:52.518606 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:52 crc kubenswrapper[5008]: I1126 22:40:52.518455 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:52 crc kubenswrapper[5008]: E1126 22:40:52.518897 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:52 crc kubenswrapper[5008]: E1126 22:40:52.518947 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:53 crc kubenswrapper[5008]: I1126 22:40:53.518402 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:53 crc kubenswrapper[5008]: E1126 22:40:53.519150 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:54 crc kubenswrapper[5008]: I1126 22:40:54.518418 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:54 crc kubenswrapper[5008]: E1126 22:40:54.518580 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:54 crc kubenswrapper[5008]: I1126 22:40:54.518829 5008 scope.go:117] "RemoveContainer" containerID="7771b2db4cb155a3eaf8d9c456340e0c2fdae6e830412dfb95a69482a8fd39fa" Nov 26 22:40:54 crc kubenswrapper[5008]: I1126 22:40:54.518889 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:54 crc kubenswrapper[5008]: E1126 22:40:54.519010 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:54 crc kubenswrapper[5008]: I1126 22:40:54.519224 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:54 crc kubenswrapper[5008]: E1126 22:40:54.519347 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:55 crc kubenswrapper[5008]: I1126 22:40:55.292442 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r4xtd_8509b0e0-c914-44a1-a657-ffb4f5a86c18/kube-multus/1.log" Nov 26 22:40:55 crc kubenswrapper[5008]: I1126 22:40:55.292493 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r4xtd" event={"ID":"8509b0e0-c914-44a1-a657-ffb4f5a86c18","Type":"ContainerStarted","Data":"0d9e736c1f4ff40e4b3fa98fec6877c0380a481efbf6e05b4d566b33c9d31ba7"} Nov 26 22:40:55 crc kubenswrapper[5008]: I1126 22:40:55.517455 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:55 crc kubenswrapper[5008]: E1126 22:40:55.518490 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:55 crc kubenswrapper[5008]: E1126 22:40:55.615020 5008 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 26 22:40:56 crc kubenswrapper[5008]: I1126 22:40:56.517523 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:56 crc kubenswrapper[5008]: I1126 22:40:56.517628 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:56 crc kubenswrapper[5008]: E1126 22:40:56.517697 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:56 crc kubenswrapper[5008]: E1126 22:40:56.517824 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:56 crc kubenswrapper[5008]: I1126 22:40:56.517902 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:56 crc kubenswrapper[5008]: E1126 22:40:56.518041 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:57 crc kubenswrapper[5008]: I1126 22:40:57.517536 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:57 crc kubenswrapper[5008]: E1126 22:40:57.517779 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:40:58 crc kubenswrapper[5008]: I1126 22:40:58.518092 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:40:58 crc kubenswrapper[5008]: I1126 22:40:58.518123 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:40:58 crc kubenswrapper[5008]: I1126 22:40:58.518176 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:40:58 crc kubenswrapper[5008]: E1126 22:40:58.518399 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:40:58 crc kubenswrapper[5008]: E1126 22:40:58.518541 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:40:58 crc kubenswrapper[5008]: E1126 22:40:58.518680 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:40:58 crc kubenswrapper[5008]: I1126 22:40:58.519729 5008 scope.go:117] "RemoveContainer" containerID="902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45" Nov 26 22:40:59 crc kubenswrapper[5008]: I1126 22:40:59.309232 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovnkube-controller/3.log" Nov 26 22:40:59 crc kubenswrapper[5008]: I1126 22:40:59.312429 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerStarted","Data":"f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710"} Nov 26 22:40:59 crc kubenswrapper[5008]: I1126 22:40:59.312856 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:40:59 crc kubenswrapper[5008]: I1126 22:40:59.341326 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" podStartSLOduration=112.341302515 podStartE2EDuration="1m52.341302515s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 
22:40:59.339464783 +0000 UTC m=+134.752158815" watchObservedRunningTime="2025-11-26 22:40:59.341302515 +0000 UTC m=+134.753996557" Nov 26 22:40:59 crc kubenswrapper[5008]: I1126 22:40:59.426379 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xplkg"] Nov 26 22:40:59 crc kubenswrapper[5008]: I1126 22:40:59.426516 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:40:59 crc kubenswrapper[5008]: E1126 22:40:59.426661 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:41:00 crc kubenswrapper[5008]: I1126 22:41:00.518201 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:41:00 crc kubenswrapper[5008]: I1126 22:41:00.518228 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:41:00 crc kubenswrapper[5008]: E1126 22:41:00.518382 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:41:00 crc kubenswrapper[5008]: I1126 22:41:00.518228 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:41:00 crc kubenswrapper[5008]: E1126 22:41:00.518447 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:41:00 crc kubenswrapper[5008]: E1126 22:41:00.518492 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:41:00 crc kubenswrapper[5008]: E1126 22:41:00.616727 5008 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 26 22:41:01 crc kubenswrapper[5008]: I1126 22:41:01.518485 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:41:01 crc kubenswrapper[5008]: E1126 22:41:01.518727 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:41:02 crc kubenswrapper[5008]: I1126 22:41:02.518235 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:41:02 crc kubenswrapper[5008]: I1126 22:41:02.518247 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:41:02 crc kubenswrapper[5008]: I1126 22:41:02.518247 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:41:02 crc kubenswrapper[5008]: E1126 22:41:02.518430 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:41:02 crc kubenswrapper[5008]: E1126 22:41:02.518530 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:41:02 crc kubenswrapper[5008]: E1126 22:41:02.518620 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:41:03 crc kubenswrapper[5008]: I1126 22:41:03.518325 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:41:03 crc kubenswrapper[5008]: E1126 22:41:03.518485 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:41:04 crc kubenswrapper[5008]: I1126 22:41:04.518009 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:41:04 crc kubenswrapper[5008]: I1126 22:41:04.518027 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:41:04 crc kubenswrapper[5008]: E1126 22:41:04.518592 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 22:41:04 crc kubenswrapper[5008]: I1126 22:41:04.518053 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:41:04 crc kubenswrapper[5008]: E1126 22:41:04.518473 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 22:41:04 crc kubenswrapper[5008]: E1126 22:41:04.518910 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 22:41:05 crc kubenswrapper[5008]: I1126 22:41:05.518255 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:41:05 crc kubenswrapper[5008]: E1126 22:41:05.520213 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xplkg" podUID="7adf9a69-5de6-4710-b394-968387df9ae6" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.172081 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.227868 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-w5vzs"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.229183 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.229226 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-vcl6w"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.230266 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vcl6w" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.231071 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7rb5v"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.232364 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xsdh6"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.233063 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7rb5v" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.233078 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.233783 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.234056 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.234426 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.234830 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-j5n5x"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.235088 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.235193 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-j5n5x" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.239019 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-72g4c"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.239543 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rz4wr"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.239866 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rz4wr" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.242838 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72g4c" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.246745 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pwq7l"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.247279 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pwq7l" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.248135 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p5xvf"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.248784 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p5xvf" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.250571 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k77kt"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.251162 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.251283 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.255865 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-65m9h"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.256354 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.256668 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.256688 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.256823 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.257234 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-t29mr"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.257923 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-t29mr" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.259505 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qdkcv"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.260246 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65x7g"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.260889 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65x7g" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.261358 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qdkcv" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.261375 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xf97j"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.268637 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.278547 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.279513 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.279741 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.279921 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.280097 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.280231 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.280345 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dzzbw"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.280872 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.280360 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.280480 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.282094 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.281625 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.282224 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.281711 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.281810 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.282494 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.286399 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.286622 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.286823 5008 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.287026 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.287136 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.287219 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.287321 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.287475 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.287579 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.287732 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.287937 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.288463 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.295925 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.296462 5008 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-m4q9d"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.297459 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.313474 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.314654 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.314879 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.315052 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.315219 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.315327 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.315469 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.315615 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.315727 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.315823 5008 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.315914 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.316067 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.316227 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.316350 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.316461 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.316646 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.316744 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.316844 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.317627 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.317840 5008 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.318028 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.318181 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.318340 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.318513 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.318625 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.318823 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.318979 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.319191 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.324814 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.325187 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 
22:41:06.325252 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.325409 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.325929 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.326028 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.326270 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.326403 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.326499 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.326573 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.326667 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.326749 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.326824 5008 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.326834 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.326981 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.327083 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.327325 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.327461 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.328244 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.328378 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.328561 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.328682 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.328814 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" 
Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.328994 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.329108 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.329144 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.329110 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.329287 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.329358 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.329502 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.329549 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.329610 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.330096 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9qsl7"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 
22:41:06.330356 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.330441 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-47bkz"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.330768 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jc57b"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.330816 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.331081 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.331306 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-m4q9d" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.331387 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jc57b" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.331312 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-w5vzs"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.331587 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.331625 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9qsl7" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.331807 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-47bkz" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.332621 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.332863 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.337096 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d2869e8-cbd3-43a7-b374-bf97ca15d5df-serving-cert\") pod \"openshift-config-operator-7777fb866f-72g4c\" (UID: \"1d2869e8-cbd3-43a7-b374-bf97ca15d5df\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72g4c" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.337196 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llxf9\" (UniqueName: \"kubernetes.io/projected/1d2869e8-cbd3-43a7-b374-bf97ca15d5df-kube-api-access-llxf9\") pod \"openshift-config-operator-7777fb866f-72g4c\" (UID: \"1d2869e8-cbd3-43a7-b374-bf97ca15d5df\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72g4c" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.337235 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1d2869e8-cbd3-43a7-b374-bf97ca15d5df-available-featuregates\") pod \"openshift-config-operator-7777fb866f-72g4c\" (UID: 
\"1d2869e8-cbd3-43a7-b374-bf97ca15d5df\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72g4c" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.337561 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fkn42"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.338171 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-qhtsg"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.338532 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.338532 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fkn42" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.347390 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.375593 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.375878 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.376185 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzqh8"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.377089 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzqh8" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.378328 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bv6bq"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.381749 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.398467 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.398908 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.399955 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.400500 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.400557 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xsdh6"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.400659 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bv6bq" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.400775 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.401748 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.403030 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-72g4c"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.403574 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.404285 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.404541 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-s2w2s"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.405552 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7rb5v"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.405660 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s2w2s" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.407160 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.407498 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.407995 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.409115 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.409528 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.411836 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.412571 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7hhl"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.413182 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7hhl" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.413930 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-47sdz"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.415231 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-47sdz" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.415734 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rz4wr"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.416395 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-drtxl"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.417306 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-drtxl" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.417347 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rwq8l"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.417993 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.418360 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.421988 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4d4wh"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.422555 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4d4wh" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.422956 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6x274"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.423476 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6x274" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.423866 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.424270 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.426012 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-w58c8"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.426524 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w58c8" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.427600 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.428217 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.429181 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p5xvf"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.430236 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.431049 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.431837 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-t29mr"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.433410 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.434889 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-65m9h"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.437892 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llxf9\" (UniqueName: \"kubernetes.io/projected/1d2869e8-cbd3-43a7-b374-bf97ca15d5df-kube-api-access-llxf9\") pod \"openshift-config-operator-7777fb866f-72g4c\" (UID: \"1d2869e8-cbd3-43a7-b374-bf97ca15d5df\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72g4c" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.437929 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1d2869e8-cbd3-43a7-b374-bf97ca15d5df-available-featuregates\") pod \"openshift-config-operator-7777fb866f-72g4c\" (UID: \"1d2869e8-cbd3-43a7-b374-bf97ca15d5df\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-72g4c" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.437983 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d2869e8-cbd3-43a7-b374-bf97ca15d5df-serving-cert\") pod \"openshift-config-operator-7777fb866f-72g4c\" (UID: \"1d2869e8-cbd3-43a7-b374-bf97ca15d5df\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72g4c" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.438630 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1d2869e8-cbd3-43a7-b374-bf97ca15d5df-available-featuregates\") pod \"openshift-config-operator-7777fb866f-72g4c\" (UID: \"1d2869e8-cbd3-43a7-b374-bf97ca15d5df\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72g4c" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.438624 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.439848 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-47bkz"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.443348 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pwq7l"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.443884 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d2869e8-cbd3-43a7-b374-bf97ca15d5df-serving-cert\") pod \"openshift-config-operator-7777fb866f-72g4c\" (UID: \"1d2869e8-cbd3-43a7-b374-bf97ca15d5df\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72g4c" Nov 26 22:41:06 crc 
kubenswrapper[5008]: I1126 22:41:06.446541 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-s2w2s"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.447496 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dzzbw"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.448900 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-m4q9d"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.451906 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.452368 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65x7g"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.455204 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-j5n5x"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.456386 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k77kt"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.458085 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xf97j"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.458936 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bv6bq"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.460344 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fkn42"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.460594 5008 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzqh8"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.461665 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rwq8l"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.463411 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.470670 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qdkcv"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.471287 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.473227 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9qsl7"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.475865 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nzkzs"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.477462 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nzkzs" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.477842 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-lbcxn"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.478871 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-drtxl"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.478953 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.479805 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6x274"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.482885 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7hhl"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.484260 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nzkzs"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.485684 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jc57b"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.487243 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-w58c8"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.488622 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-lbcxn"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.489878 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.491320 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.493449 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4d4wh"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.496025 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.497721 5008 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj"] Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.511711 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.517803 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.517814 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.517949 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.531592 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.551271 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.611156 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.631233 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.651379 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.672122 5008 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"metrics-tls" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.699681 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.711600 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.731714 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.751587 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.771858 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.792457 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.812514 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.831308 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.851500 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.872510 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.891611 
5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.911680 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.932004 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.951920 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.971219 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 26 22:41:06 crc kubenswrapper[5008]: I1126 22:41:06.992230 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.011738 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.031894 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.051891 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.072421 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.092746 5008 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.112663 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.131375 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.152859 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.172280 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.193222 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.211590 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.232480 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.253119 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.272954 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.292325 5008 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.312540 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.331232 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.351956 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.372331 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.392657 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.409610 5008 request.go:700] Waited for 1.003765163s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/secrets?fieldSelector=metadata.name%3Dkube-storage-version-migrator-sa-dockercfg-5xfcg&limit=500&resourceVersion=0 Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.411004 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.433155 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.451777 
5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.472657 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.492216 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.512578 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.517762 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.531944 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.552371 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.570909 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.591777 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.612371 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.632434 5008 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.651635 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.671767 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.692422 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.711415 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.731948 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.751945 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.774661 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.791846 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.811952 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 22:41:07.831894 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 26 22:41:07 crc kubenswrapper[5008]: I1126 
22:41:07.851003 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.293344 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.293521 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.293700 5008 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.293873 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.293979 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.294102 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.294183 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.294232 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.294309 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.294405 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 26 22:41:08 crc 
kubenswrapper[5008]: I1126 22:41:08.294446 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.294487 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.296436 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.297101 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.297260 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.297425 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.297546 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.297648 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.297863 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.298028 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.298249 5008 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.311280 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.312795 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llxf9\" (UniqueName: \"kubernetes.io/projected/1d2869e8-cbd3-43a7-b374-bf97ca15d5df-kube-api-access-llxf9\") pod \"openshift-config-operator-7777fb866f-72g4c\" (UID: \"1d2869e8-cbd3-43a7-b374-bf97ca15d5df\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72g4c" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.331476 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.351295 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.411940 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.435901 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462065 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/520338ff-81f2-44c4-9c0f-0d73bd42984c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pwq7l\" (UID: \"520338ff-81f2-44c4-9c0f-0d73bd42984c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pwq7l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462112 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bl7t\" (UniqueName: \"kubernetes.io/projected/e92e5871-b651-4219-ab37-6c973ea7fc92-kube-api-access-5bl7t\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462131 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1c61407a-51aa-45ed-b80c-0376f72f01a8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-65x7g\" (UID: \"1c61407a-51aa-45ed-b80c-0376f72f01a8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65x7g" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462152 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ac7305-8c6d-4fb3-900f-7c64bf930d38-config\") pod \"machine-approver-56656f9798-vcl6w\" (UID: \"f9ac7305-8c6d-4fb3-900f-7c64bf930d38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vcl6w" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462167 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgzt9\" (UniqueName: \"kubernetes.io/projected/f9ac7305-8c6d-4fb3-900f-7c64bf930d38-kube-api-access-cgzt9\") pod \"machine-approver-56656f9798-vcl6w\" (UID: \"f9ac7305-8c6d-4fb3-900f-7c64bf930d38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vcl6w" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462181 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3e07175e-42fa-4065-adb1-3469e75ea4d8-console-config\") pod 
\"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462205 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/520338ff-81f2-44c4-9c0f-0d73bd42984c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pwq7l\" (UID: \"520338ff-81f2-44c4-9c0f-0d73bd42984c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pwq7l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462251 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/607a3fbf-3586-4c55-894a-f9107fc5679d-serving-cert\") pod \"controller-manager-879f6c89f-xsdh6\" (UID: \"607a3fbf-3586-4c55-894a-f9107fc5679d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462268 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462286 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e92e5871-b651-4219-ab37-6c973ea7fc92-encryption-config\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462301 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/90717c11-47b1-4265-8ea3-9c826850e812-registry-certificates\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462314 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfrqw\" (UniqueName: \"kubernetes.io/projected/90717c11-47b1-4265-8ea3-9c826850e812-kube-api-access-cfrqw\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462329 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e92e5871-b651-4219-ab37-6c973ea7fc92-etcd-client\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462351 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2e26305-8405-49e7-a5f6-611f769851d6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rz4wr\" (UID: \"e2e26305-8405-49e7-a5f6-611f769851d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rz4wr" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462367 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/90717c11-47b1-4265-8ea3-9c826850e812-ca-trust-extracted\") pod 
\"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462382 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grls8\" (UniqueName: \"kubernetes.io/projected/1c61407a-51aa-45ed-b80c-0376f72f01a8-kube-api-access-grls8\") pod \"cluster-image-registry-operator-dc59b4c8b-65x7g\" (UID: \"1c61407a-51aa-45ed-b80c-0376f72f01a8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65x7g" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462397 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-client-ca\") pod \"route-controller-manager-6576b87f9c-9qz6q\" (UID: \"23eb76e7-84b0-4e28-8efb-a0454bd41d1e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462412 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462426 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/607a3fbf-3586-4c55-894a-f9107fc5679d-config\") pod \"controller-manager-879f6c89f-xsdh6\" (UID: \"607a3fbf-3586-4c55-894a-f9107fc5679d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:41:08 crc kubenswrapper[5008]: 
I1126 22:41:08.462441 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/17969c13-4b9e-4a72-b81f-7160db060271-etcd-service-ca\") pod \"etcd-operator-b45778765-dzzbw\" (UID: \"17969c13-4b9e-4a72-b81f-7160db060271\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462455 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/397c9d40-0280-4491-8aa1-2f97f28d0b9e-encryption-config\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462471 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d57b4c5c-1f9c-4317-8912-dfc20269e1af-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-j5n5x\" (UID: \"d57b4c5c-1f9c-4317-8912-dfc20269e1af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j5n5x" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462489 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/17969c13-4b9e-4a72-b81f-7160db060271-etcd-client\") pod \"etcd-operator-b45778765-dzzbw\" (UID: \"17969c13-4b9e-4a72-b81f-7160db060271\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462505 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxg8f\" (UniqueName: \"kubernetes.io/projected/397c9d40-0280-4491-8aa1-2f97f28d0b9e-kube-api-access-hxg8f\") pod 
\"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462521 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/90717c11-47b1-4265-8ea3-9c826850e812-bound-sa-token\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462535 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-serving-cert\") pod \"route-controller-manager-6576b87f9c-9qz6q\" (UID: \"23eb76e7-84b0-4e28-8efb-a0454bd41d1e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462560 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e92e5871-b651-4219-ab37-6c973ea7fc92-audit-policies\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462574 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/041c5cf5-31c5-436f-b5c2-847ee3f3708c-metrics-tls\") pod \"dns-operator-744455d44c-m4q9d\" (UID: \"041c5cf5-31c5-436f-b5c2-847ee3f3708c\") " pod="openshift-dns-operator/dns-operator-744455d44c-m4q9d" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462588 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-lq8sd\" (UniqueName: \"kubernetes.io/projected/17969c13-4b9e-4a72-b81f-7160db060271-kube-api-access-lq8sd\") pod \"etcd-operator-b45778765-dzzbw\" (UID: \"17969c13-4b9e-4a72-b81f-7160db060271\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462601 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3e07175e-42fa-4065-adb1-3469e75ea4d8-console-oauth-config\") pod \"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462615 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3e07175e-42fa-4065-adb1-3469e75ea4d8-service-ca\") pod \"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462635 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c02a7188-7359-43e4-aed9-dd67d2b42875-images\") pod \"machine-api-operator-5694c8668f-7rb5v\" (UID: \"c02a7188-7359-43e4-aed9-dd67d2b42875\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7rb5v" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462651 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c02a7188-7359-43e4-aed9-dd67d2b42875-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7rb5v\" (UID: \"c02a7188-7359-43e4-aed9-dd67d2b42875\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7rb5v" Nov 26 22:41:08 
crc kubenswrapper[5008]: I1126 22:41:08.462666 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462680 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e92e5871-b651-4219-ab37-6c973ea7fc92-serving-cert\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462695 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rbfd\" (UniqueName: \"kubernetes.io/projected/743cf3d7-7296-4b87-94e3-67fc66dacca7-kube-api-access-6rbfd\") pod \"cluster-samples-operator-665b6dd947-p5xvf\" (UID: \"743cf3d7-7296-4b87-94e3-67fc66dacca7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p5xvf" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462712 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e07175e-42fa-4065-adb1-3469e75ea4d8-trusted-ca-bundle\") pod \"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462727 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/743cf3d7-7296-4b87-94e3-67fc66dacca7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-p5xvf\" (UID: \"743cf3d7-7296-4b87-94e3-67fc66dacca7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p5xvf" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462741 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj4md\" (UniqueName: \"kubernetes.io/projected/e2e26305-8405-49e7-a5f6-611f769851d6-kube-api-access-gj4md\") pod \"openshift-apiserver-operator-796bbdcf4f-rz4wr\" (UID: \"e2e26305-8405-49e7-a5f6-611f769851d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rz4wr" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462755 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d57b4c5c-1f9c-4317-8912-dfc20269e1af-serving-cert\") pod \"authentication-operator-69f744f599-j5n5x\" (UID: \"d57b4c5c-1f9c-4317-8912-dfc20269e1af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j5n5x" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462770 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/397c9d40-0280-4491-8aa1-2f97f28d0b9e-etcd-client\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462784 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90717c11-47b1-4265-8ea3-9c826850e812-trusted-ca\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462799 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f9ac7305-8c6d-4fb3-900f-7c64bf930d38-auth-proxy-config\") pod \"machine-approver-56656f9798-vcl6w\" (UID: \"f9ac7305-8c6d-4fb3-900f-7c64bf930d38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vcl6w" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462814 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c61407a-51aa-45ed-b80c-0376f72f01a8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-65x7g\" (UID: \"1c61407a-51aa-45ed-b80c-0376f72f01a8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65x7g" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462827 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/397c9d40-0280-4491-8aa1-2f97f28d0b9e-etcd-serving-ca\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.462845 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/90717c11-47b1-4265-8ea3-9c826850e812-registry-tls\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463056 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wrx9f\" (UniqueName: \"kubernetes.io/projected/6f72cb1c-2994-4ed4-8333-b7498c1615bf-kube-api-access-wrx9f\") pod \"downloads-7954f5f757-t29mr\" (UID: \"6f72cb1c-2994-4ed4-8333-b7498c1615bf\") " pod="openshift-console/downloads-7954f5f757-t29mr" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463082 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whghj\" (UniqueName: \"kubernetes.io/projected/520338ff-81f2-44c4-9c0f-0d73bd42984c-kube-api-access-whghj\") pod \"openshift-controller-manager-operator-756b6f6bc6-pwq7l\" (UID: \"520338ff-81f2-44c4-9c0f-0d73bd42984c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pwq7l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463102 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np7b2\" (UniqueName: \"kubernetes.io/projected/607a3fbf-3586-4c55-894a-f9107fc5679d-kube-api-access-np7b2\") pod \"controller-manager-879f6c89f-xsdh6\" (UID: \"607a3fbf-3586-4c55-894a-f9107fc5679d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463131 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463152 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463174 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/397c9d40-0280-4491-8aa1-2f97f28d0b9e-audit\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463191 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c02a7188-7359-43e4-aed9-dd67d2b42875-config\") pod \"machine-api-operator-5694c8668f-7rb5v\" (UID: \"c02a7188-7359-43e4-aed9-dd67d2b42875\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7rb5v" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463204 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397c9d40-0280-4491-8aa1-2f97f28d0b9e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463218 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n2sb\" (UniqueName: \"kubernetes.io/projected/c02a7188-7359-43e4-aed9-dd67d2b42875-kube-api-access-9n2sb\") pod \"machine-api-operator-5694c8668f-7rb5v\" (UID: \"c02a7188-7359-43e4-aed9-dd67d2b42875\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7rb5v" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463231 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-config\") pod \"route-controller-manager-6576b87f9c-9qz6q\" (UID: \"23eb76e7-84b0-4e28-8efb-a0454bd41d1e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463244 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463260 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c61407a-51aa-45ed-b80c-0376f72f01a8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-65x7g\" (UID: \"1c61407a-51aa-45ed-b80c-0376f72f01a8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65x7g" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463275 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54wwj\" (UniqueName: \"kubernetes.io/projected/4f56002e-5cd7-4eb0-9228-88e40e9a9942-kube-api-access-54wwj\") pod \"console-operator-58897d9998-qdkcv\" (UID: \"4f56002e-5cd7-4eb0-9228-88e40e9a9942\") " pod="openshift-console-operator/console-operator-58897d9998-qdkcv" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463292 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2e26305-8405-49e7-a5f6-611f769851d6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rz4wr\" (UID: 
\"e2e26305-8405-49e7-a5f6-611f769851d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rz4wr" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463311 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f56002e-5cd7-4eb0-9228-88e40e9a9942-config\") pod \"console-operator-58897d9998-qdkcv\" (UID: \"4f56002e-5cd7-4eb0-9228-88e40e9a9942\") " pod="openshift-console-operator/console-operator-58897d9998-qdkcv" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463331 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f56002e-5cd7-4eb0-9228-88e40e9a9942-serving-cert\") pod \"console-operator-58897d9998-qdkcv\" (UID: \"4f56002e-5cd7-4eb0-9228-88e40e9a9942\") " pod="openshift-console-operator/console-operator-58897d9998-qdkcv" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463348 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/397c9d40-0280-4491-8aa1-2f97f28d0b9e-image-import-ca\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463367 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/397c9d40-0280-4491-8aa1-2f97f28d0b9e-node-pullsecrets\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463387 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/90717c11-47b1-4265-8ea3-9c826850e812-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463406 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e92e5871-b651-4219-ab37-6c973ea7fc92-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463426 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463444 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17969c13-4b9e-4a72-b81f-7160db060271-config\") pod \"etcd-operator-b45778765-dzzbw\" (UID: \"17969c13-4b9e-4a72-b81f-7160db060271\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463479 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e07175e-42fa-4065-adb1-3469e75ea4d8-console-serving-cert\") pod \"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 
22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463500 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e92e5871-b651-4219-ab37-6c973ea7fc92-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463519 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d57b4c5c-1f9c-4317-8912-dfc20269e1af-config\") pod \"authentication-operator-69f744f599-j5n5x\" (UID: \"d57b4c5c-1f9c-4317-8912-dfc20269e1af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j5n5x" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463539 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463561 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/17969c13-4b9e-4a72-b81f-7160db060271-etcd-ca\") pod \"etcd-operator-b45778765-dzzbw\" (UID: \"17969c13-4b9e-4a72-b81f-7160db060271\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463581 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/397c9d40-0280-4491-8aa1-2f97f28d0b9e-config\") pod 
\"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463612 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3e07175e-42fa-4065-adb1-3469e75ea4d8-oauth-serving-cert\") pod \"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463631 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e92e5871-b651-4219-ab37-6c973ea7fc92-audit-dir\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463649 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/397c9d40-0280-4491-8aa1-2f97f28d0b9e-serving-cert\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463670 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-audit-policies\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463694 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/4f56002e-5cd7-4eb0-9228-88e40e9a9942-trusted-ca\") pod \"console-operator-58897d9998-qdkcv\" (UID: \"4f56002e-5cd7-4eb0-9228-88e40e9a9942\") " pod="openshift-console-operator/console-operator-58897d9998-qdkcv" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463716 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p22sq\" (UniqueName: \"kubernetes.io/projected/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-kube-api-access-p22sq\") pod \"route-controller-manager-6576b87f9c-9qz6q\" (UID: \"23eb76e7-84b0-4e28-8efb-a0454bd41d1e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463737 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463756 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8z8d\" (UniqueName: \"kubernetes.io/projected/1cf78e11-8113-4dd4-9b76-182452887bf3-kube-api-access-c8z8d\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463776 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/607a3fbf-3586-4c55-894a-f9107fc5679d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xsdh6\" (UID: \"607a3fbf-3586-4c55-894a-f9107fc5679d\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463795 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17969c13-4b9e-4a72-b81f-7160db060271-serving-cert\") pod \"etcd-operator-b45778765-dzzbw\" (UID: \"17969c13-4b9e-4a72-b81f-7160db060271\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463813 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b22st\" (UniqueName: \"kubernetes.io/projected/041c5cf5-31c5-436f-b5c2-847ee3f3708c-kube-api-access-b22st\") pod \"dns-operator-744455d44c-m4q9d\" (UID: \"041c5cf5-31c5-436f-b5c2-847ee3f3708c\") " pod="openshift-dns-operator/dns-operator-744455d44c-m4q9d" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463832 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/397c9d40-0280-4491-8aa1-2f97f28d0b9e-audit-dir\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463860 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463881 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/f9ac7305-8c6d-4fb3-900f-7c64bf930d38-machine-approver-tls\") pod \"machine-approver-56656f9798-vcl6w\" (UID: \"f9ac7305-8c6d-4fb3-900f-7c64bf930d38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vcl6w" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463901 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d57b4c5c-1f9c-4317-8912-dfc20269e1af-service-ca-bundle\") pod \"authentication-operator-69f744f599-j5n5x\" (UID: \"d57b4c5c-1f9c-4317-8912-dfc20269e1af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j5n5x" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463938 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.463958 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsbl5\" (UniqueName: \"kubernetes.io/projected/3e07175e-42fa-4065-adb1-3469e75ea4d8-kube-api-access-jsbl5\") pod \"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.464005 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqf5x\" (UniqueName: \"kubernetes.io/projected/d57b4c5c-1f9c-4317-8912-dfc20269e1af-kube-api-access-xqf5x\") pod \"authentication-operator-69f744f599-j5n5x\" (UID: \"d57b4c5c-1f9c-4317-8912-dfc20269e1af\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-j5n5x" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.464025 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1cf78e11-8113-4dd4-9b76-182452887bf3-audit-dir\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.464044 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/607a3fbf-3586-4c55-894a-f9107fc5679d-client-ca\") pod \"controller-manager-879f6c89f-xsdh6\" (UID: \"607a3fbf-3586-4c55-894a-f9107fc5679d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.464064 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: E1126 22:41:08.465053 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:08.965039056 +0000 UTC m=+144.377733058 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.556657 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72g4c" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.564815 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565010 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3e07175e-42fa-4065-adb1-3469e75ea4d8-console-oauth-config\") pod \"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:08 crc kubenswrapper[5008]: E1126 22:41:08.565099 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:09.065064036 +0000 UTC m=+144.477758098 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565305 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bb7aec8-881b-46d2-a40d-0758d1141f55-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fkn42\" (UID: \"3bb7aec8-881b-46d2-a40d-0758d1141f55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fkn42" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565358 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565391 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3e07175e-42fa-4065-adb1-3469e75ea4d8-service-ca\") pod \"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565423 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq8pf\" (UniqueName: 
\"kubernetes.io/projected/68c3e5aa-40f1-4600-951b-1c2460ed8466-kube-api-access-rq8pf\") pod \"migrator-59844c95c7-s2w2s\" (UID: \"68c3e5aa-40f1-4600-951b-1c2460ed8466\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s2w2s" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565455 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c02a7188-7359-43e4-aed9-dd67d2b42875-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7rb5v\" (UID: \"c02a7188-7359-43e4-aed9-dd67d2b42875\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7rb5v" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565493 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e92e5871-b651-4219-ab37-6c973ea7fc92-serving-cert\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565518 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e07175e-42fa-4065-adb1-3469e75ea4d8-trusted-ca-bundle\") pod \"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565547 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97073e75-5bda-4dc4-8d80-bf408068aaef-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rwq8l\" (UID: \"97073e75-5bda-4dc4-8d80-bf408068aaef\") " pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565576 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e547d918-c76b-4a9a-8717-ee227b89818d-registration-dir\") pod \"csi-hostpathplugin-lbcxn\" (UID: \"e547d918-c76b-4a9a-8717-ee227b89818d\") " pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565607 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/967a2b95-097f-4abc-b5eb-e6642b7f2e97-serving-cert\") pod \"service-ca-operator-777779d784-6x274\" (UID: \"967a2b95-097f-4abc-b5eb-e6642b7f2e97\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6x274" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565638 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e547d918-c76b-4a9a-8717-ee227b89818d-csi-data-dir\") pod \"csi-hostpathplugin-lbcxn\" (UID: \"e547d918-c76b-4a9a-8717-ee227b89818d\") " pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565664 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c50d9878-2526-49de-8461-8b077b1c688e-tmpfs\") pod \"packageserver-d55dfcdfc-l7qvj\" (UID: \"c50d9878-2526-49de-8461-8b077b1c688e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565694 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj4md\" (UniqueName: \"kubernetes.io/projected/e2e26305-8405-49e7-a5f6-611f769851d6-kube-api-access-gj4md\") pod \"openshift-apiserver-operator-796bbdcf4f-rz4wr\" (UID: \"e2e26305-8405-49e7-a5f6-611f769851d6\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rz4wr" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565725 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90717c11-47b1-4265-8ea3-9c826850e812-trusted-ca\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565755 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8nwb\" (UniqueName: \"kubernetes.io/projected/5707c8e2-cffd-4f02-ab17-5eba261c57da-kube-api-access-b8nwb\") pod \"machine-config-server-47sdz\" (UID: \"5707c8e2-cffd-4f02-ab17-5eba261c57da\") " pod="openshift-machine-config-operator/machine-config-server-47sdz" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565783 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/90717c11-47b1-4265-8ea3-9c826850e812-registry-tls\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565813 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrx9f\" (UniqueName: \"kubernetes.io/projected/6f72cb1c-2994-4ed4-8333-b7498c1615bf-kube-api-access-wrx9f\") pod \"downloads-7954f5f757-t29mr\" (UID: \"6f72cb1c-2994-4ed4-8333-b7498c1615bf\") " pod="openshift-console/downloads-7954f5f757-t29mr" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565844 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np7b2\" (UniqueName: 
\"kubernetes.io/projected/607a3fbf-3586-4c55-894a-f9107fc5679d-kube-api-access-np7b2\") pod \"controller-manager-879f6c89f-xsdh6\" (UID: \"607a3fbf-3586-4c55-894a-f9107fc5679d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565872 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c61407a-51aa-45ed-b80c-0376f72f01a8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-65x7g\" (UID: \"1c61407a-51aa-45ed-b80c-0376f72f01a8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65x7g" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565903 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/397c9d40-0280-4491-8aa1-2f97f28d0b9e-etcd-serving-ca\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.565932 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpwvx\" (UniqueName: \"kubernetes.io/projected/967a2b95-097f-4abc-b5eb-e6642b7f2e97-kube-api-access-vpwvx\") pod \"service-ca-operator-777779d784-6x274\" (UID: \"967a2b95-097f-4abc-b5eb-e6642b7f2e97\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6x274" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.566005 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397c9d40-0280-4491-8aa1-2f97f28d0b9e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 
22:41:08.566039 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltxgs\" (UniqueName: \"kubernetes.io/projected/75f43961-be76-4bce-9d0f-9841bdd21c06-kube-api-access-ltxgs\") pod \"package-server-manager-789f6589d5-t7hhl\" (UID: \"75f43961-be76-4bce-9d0f-9841bdd21c06\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7hhl" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.566066 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bbadf2d2-48a3-499e-8dac-2ff8520cf311-signing-key\") pod \"service-ca-9c57cc56f-drtxl\" (UID: \"bbadf2d2-48a3-499e-8dac-2ff8520cf311\") " pod="openshift-service-ca/service-ca-9c57cc56f-drtxl" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.566101 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n2sb\" (UniqueName: \"kubernetes.io/projected/c02a7188-7359-43e4-aed9-dd67d2b42875-kube-api-access-9n2sb\") pod \"machine-api-operator-5694c8668f-7rb5v\" (UID: \"c02a7188-7359-43e4-aed9-dd67d2b42875\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7rb5v" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.566134 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-config\") pod \"route-controller-manager-6576b87f9c-9qz6q\" (UID: \"23eb76e7-84b0-4e28-8efb-a0454bd41d1e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.566163 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.566198 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c97b71a3-678a-485a-a890-508df1e45bdf-config\") pod \"kube-apiserver-operator-766d6c64bb-9qsl7\" (UID: \"c97b71a3-678a-485a-a890-508df1e45bdf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9qsl7" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.566234 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2e26305-8405-49e7-a5f6-611f769851d6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rz4wr\" (UID: \"e2e26305-8405-49e7-a5f6-611f769851d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rz4wr" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.566264 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bb7aec8-881b-46d2-a40d-0758d1141f55-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fkn42\" (UID: \"3bb7aec8-881b-46d2-a40d-0758d1141f55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fkn42" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.566296 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h4cg\" (UniqueName: \"kubernetes.io/projected/86121519-d315-4eb4-9cd0-ab26bcd23b42-kube-api-access-7h4cg\") pod \"machine-config-controller-84d6567774-jc57b\" (UID: \"86121519-d315-4eb4-9cd0-ab26bcd23b42\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jc57b" Nov 26 22:41:08 crc kubenswrapper[5008]: 
I1126 22:41:08.566326 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56952069-b3c4-4136-94c0-fe90a0e21fcd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vzqh8\" (UID: \"56952069-b3c4-4136-94c0-fe90a0e21fcd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzqh8" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.566353 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d7abe47-9752-4760-bbbf-5dca233eeb30-metrics-tls\") pod \"ingress-operator-5b745b69d9-nt29c\" (UID: \"7d7abe47-9752-4760-bbbf-5dca233eeb30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.566382 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swb99\" (UniqueName: \"kubernetes.io/projected/7d7abe47-9752-4760-bbbf-5dca233eeb30-kube-api-access-swb99\") pod \"ingress-operator-5b745b69d9-nt29c\" (UID: \"7d7abe47-9752-4760-bbbf-5dca233eeb30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.566406 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3bb7aec8-881b-46d2-a40d-0758d1141f55-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fkn42\" (UID: \"3bb7aec8-881b-46d2-a40d-0758d1141f55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fkn42" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.566440 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/e92e5871-b651-4219-ab37-6c973ea7fc92-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.566469 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41d44cf9-3859-402a-9c77-d5842d7a70a3-secret-volume\") pod \"collect-profiles-29403270-9mdcq\" (UID: \"41d44cf9-3859-402a-9c77-d5842d7a70a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.567178 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c50d9878-2526-49de-8461-8b077b1c688e-apiservice-cert\") pod \"packageserver-d55dfcdfc-l7qvj\" (UID: \"c50d9878-2526-49de-8461-8b077b1c688e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.568695 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-config\") pod \"route-controller-manager-6576b87f9c-9qz6q\" (UID: \"23eb76e7-84b0-4e28-8efb-a0454bd41d1e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.569305 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e92e5871-b651-4219-ab37-6c973ea7fc92-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.569363 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17969c13-4b9e-4a72-b81f-7160db060271-config\") pod \"etcd-operator-b45778765-dzzbw\" (UID: \"17969c13-4b9e-4a72-b81f-7160db060271\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.569453 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bbadf2d2-48a3-499e-8dac-2ff8520cf311-signing-cabundle\") pod \"service-ca-9c57cc56f-drtxl\" (UID: \"bbadf2d2-48a3-499e-8dac-2ff8520cf311\") " pod="openshift-service-ca/service-ca-9c57cc56f-drtxl" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.569494 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/397c9d40-0280-4491-8aa1-2f97f28d0b9e-config\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.569518 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3e07175e-42fa-4065-adb1-3469e75ea4d8-oauth-serving-cert\") pod \"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.569549 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: 
I1126 22:41:08.569581 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89f9d155-58da-4221-ac74-ef11aa42fc6f-cert\") pod \"ingress-canary-w58c8\" (UID: \"89f9d155-58da-4221-ac74-ef11aa42fc6f\") " pod="openshift-ingress-canary/ingress-canary-w58c8" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.569618 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlg9h\" (UniqueName: \"kubernetes.io/projected/bbadf2d2-48a3-499e-8dac-2ff8520cf311-kube-api-access-jlg9h\") pod \"service-ca-9c57cc56f-drtxl\" (UID: \"bbadf2d2-48a3-499e-8dac-2ff8520cf311\") " pod="openshift-service-ca/service-ca-9c57cc56f-drtxl" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.569648 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f56002e-5cd7-4eb0-9228-88e40e9a9942-trusted-ca\") pod \"console-operator-58897d9998-qdkcv\" (UID: \"4f56002e-5cd7-4eb0-9228-88e40e9a9942\") " pod="openshift-console-operator/console-operator-58897d9998-qdkcv" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.569677 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76khw\" (UniqueName: \"kubernetes.io/projected/4dc3d8f0-7538-4163-a3f9-b7e7daa055d2-kube-api-access-76khw\") pod \"control-plane-machine-set-operator-78cbb6b69f-47bkz\" (UID: \"4dc3d8f0-7538-4163-a3f9-b7e7daa055d2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-47bkz" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.569710 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt88x\" (UniqueName: \"kubernetes.io/projected/dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59-kube-api-access-nt88x\") pod \"olm-operator-6b444d44fb-dcs8r\" (UID: 
\"dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.569740 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/607a3fbf-3586-4c55-894a-f9107fc5679d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xsdh6\" (UID: \"607a3fbf-3586-4c55-894a-f9107fc5679d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.569769 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p22sq\" (UniqueName: \"kubernetes.io/projected/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-kube-api-access-p22sq\") pod \"route-controller-manager-6576b87f9c-9qz6q\" (UID: \"23eb76e7-84b0-4e28-8efb-a0454bd41d1e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.569791 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8z8d\" (UniqueName: \"kubernetes.io/projected/1cf78e11-8113-4dd4-9b76-182452887bf3-kube-api-access-c8z8d\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.569929 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/90717c11-47b1-4265-8ea3-9c826850e812-registry-tls\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.570121 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e92e5871-b651-4219-ab37-6c973ea7fc92-serving-cert\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.570491 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b22st\" (UniqueName: \"kubernetes.io/projected/041c5cf5-31c5-436f-b5c2-847ee3f3708c-kube-api-access-b22st\") pod \"dns-operator-744455d44c-m4q9d\" (UID: \"041c5cf5-31c5-436f-b5c2-847ee3f3708c\") " pod="openshift-dns-operator/dns-operator-744455d44c-m4q9d" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.570559 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/397c9d40-0280-4491-8aa1-2f97f28d0b9e-audit-dir\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.570593 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41d44cf9-3859-402a-9c77-d5842d7a70a3-config-volume\") pod \"collect-profiles-29403270-9mdcq\" (UID: \"41d44cf9-3859-402a-9c77-d5842d7a70a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.570631 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.570668 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d57b4c5c-1f9c-4317-8912-dfc20269e1af-service-ca-bundle\") pod \"authentication-operator-69f744f599-j5n5x\" (UID: \"d57b4c5c-1f9c-4317-8912-dfc20269e1af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j5n5x" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.570699 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.570724 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsbl5\" (UniqueName: \"kubernetes.io/projected/3e07175e-42fa-4065-adb1-3469e75ea4d8-kube-api-access-jsbl5\") pod \"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.570756 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1cf78e11-8113-4dd4-9b76-182452887bf3-audit-dir\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.570833 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5707c8e2-cffd-4f02-ab17-5eba261c57da-certs\") pod \"machine-config-server-47sdz\" (UID: \"5707c8e2-cffd-4f02-ab17-5eba261c57da\") " 
pod="openshift-machine-config-operator/machine-config-server-47sdz" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.570865 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e07175e-42fa-4065-adb1-3469e75ea4d8-trusted-ca-bundle\") pod \"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.570878 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.570914 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/520338ff-81f2-44c4-9c0f-0d73bd42984c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pwq7l\" (UID: \"520338ff-81f2-44c4-9c0f-0d73bd42984c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pwq7l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.570945 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ql8q\" (UniqueName: \"kubernetes.io/projected/cfe3f7c6-e363-4dd7-a52a-e5d43134aba8-kube-api-access-2ql8q\") pod \"machine-config-operator-74547568cd-htt4l\" (UID: \"cfe3f7c6-e363-4dd7-a52a-e5d43134aba8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.570993 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bl7t\" (UniqueName: 
\"kubernetes.io/projected/e92e5871-b651-4219-ab37-6c973ea7fc92-kube-api-access-5bl7t\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.571021 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgzt9\" (UniqueName: \"kubernetes.io/projected/f9ac7305-8c6d-4fb3-900f-7c64bf930d38-kube-api-access-cgzt9\") pod \"machine-approver-56656f9798-vcl6w\" (UID: \"f9ac7305-8c6d-4fb3-900f-7c64bf930d38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vcl6w" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.571051 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ac7305-8c6d-4fb3-900f-7c64bf930d38-config\") pod \"machine-approver-56656f9798-vcl6w\" (UID: \"f9ac7305-8c6d-4fb3-900f-7c64bf930d38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vcl6w" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.571079 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/607a3fbf-3586-4c55-894a-f9107fc5679d-serving-cert\") pod \"controller-manager-879f6c89f-xsdh6\" (UID: \"607a3fbf-3586-4c55-894a-f9107fc5679d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.571129 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90717c11-47b1-4265-8ea3-9c826850e812-trusted-ca\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.571150 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrf7r\" (UniqueName: \"kubernetes.io/projected/89f9d155-58da-4221-ac74-ef11aa42fc6f-kube-api-access-lrf7r\") pod \"ingress-canary-w58c8\" (UID: \"89f9d155-58da-4221-ac74-ef11aa42fc6f\") " pod="openshift-ingress-canary/ingress-canary-w58c8" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.571181 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52ab88e6-8cc7-4093-a550-570320f5e62b-config-volume\") pod \"dns-default-nzkzs\" (UID: \"52ab88e6-8cc7-4093-a550-570320f5e62b\") " pod="openshift-dns/dns-default-nzkzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.571222 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/90717c11-47b1-4265-8ea3-9c826850e812-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.571804 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/397c9d40-0280-4491-8aa1-2f97f28d0b9e-etcd-serving-ca\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.571955 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/607a3fbf-3586-4c55-894a-f9107fc5679d-config\") pod \"controller-manager-879f6c89f-xsdh6\" (UID: \"607a3fbf-3586-4c55-894a-f9107fc5679d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.572028 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-client-ca\") pod \"route-controller-manager-6576b87f9c-9qz6q\" (UID: \"23eb76e7-84b0-4e28-8efb-a0454bd41d1e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.572118 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.572177 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/218622cd-c88c-44d8-b1f9-5090c8957212-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bv6bq\" (UID: \"218622cd-c88c-44d8-b1f9-5090c8957212\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bv6bq" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.572216 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e547d918-c76b-4a9a-8717-ee227b89818d-plugins-dir\") pod \"csi-hostpathplugin-lbcxn\" (UID: \"e547d918-c76b-4a9a-8717-ee227b89818d\") " pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.572268 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/17969c13-4b9e-4a72-b81f-7160db060271-etcd-service-ca\") pod \"etcd-operator-b45778765-dzzbw\" 
(UID: \"17969c13-4b9e-4a72-b81f-7160db060271\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.572313 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/397c9d40-0280-4491-8aa1-2f97f28d0b9e-encryption-config\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.572357 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/75f43961-be76-4bce-9d0f-9841bdd21c06-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-t7hhl\" (UID: \"75f43961-be76-4bce-9d0f-9841bdd21c06\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7hhl" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.572404 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d57b4c5c-1f9c-4317-8912-dfc20269e1af-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-j5n5x\" (UID: \"d57b4c5c-1f9c-4317-8912-dfc20269e1af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j5n5x" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.572450 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxg8f\" (UniqueName: \"kubernetes.io/projected/397c9d40-0280-4491-8aa1-2f97f28d0b9e-kube-api-access-hxg8f\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.572495 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cfe3f7c6-e363-4dd7-a52a-e5d43134aba8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-htt4l\" (UID: \"cfe3f7c6-e363-4dd7-a52a-e5d43134aba8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.572538 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86121519-d315-4eb4-9cd0-ab26bcd23b42-proxy-tls\") pod \"machine-config-controller-84d6567774-jc57b\" (UID: \"86121519-d315-4eb4-9cd0-ab26bcd23b42\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jc57b" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.572571 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d7abe47-9752-4760-bbbf-5dca233eeb30-trusted-ca\") pod \"ingress-operator-5b745b69d9-nt29c\" (UID: \"7d7abe47-9752-4760-bbbf-5dca233eeb30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.572615 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/90717c11-47b1-4265-8ea3-9c826850e812-bound-sa-token\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.572658 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-serving-cert\") pod \"route-controller-manager-6576b87f9c-9qz6q\" (UID: \"23eb76e7-84b0-4e28-8efb-a0454bd41d1e\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.572680 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/90717c11-47b1-4265-8ea3-9c826850e812-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.572707 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/041c5cf5-31c5-436f-b5c2-847ee3f3708c-metrics-tls\") pod \"dns-operator-744455d44c-m4q9d\" (UID: \"041c5cf5-31c5-436f-b5c2-847ee3f3708c\") " pod="openshift-dns-operator/dns-operator-744455d44c-m4q9d" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.572749 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfe3f7c6-e363-4dd7-a52a-e5d43134aba8-proxy-tls\") pod \"machine-config-operator-74547568cd-htt4l\" (UID: \"cfe3f7c6-e363-4dd7-a52a-e5d43134aba8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.572790 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59-srv-cert\") pod \"olm-operator-6b444d44fb-dcs8r\" (UID: \"dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.572837 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq8sd\" (UniqueName: 
\"kubernetes.io/projected/17969c13-4b9e-4a72-b81f-7160db060271-kube-api-access-lq8sd\") pod \"etcd-operator-b45778765-dzzbw\" (UID: \"17969c13-4b9e-4a72-b81f-7160db060271\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.572885 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c02a7188-7359-43e4-aed9-dd67d2b42875-images\") pod \"machine-api-operator-5694c8668f-7rb5v\" (UID: \"c02a7188-7359-43e4-aed9-dd67d2b42875\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7rb5v" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.573049 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17969c13-4b9e-4a72-b81f-7160db060271-config\") pod \"etcd-operator-b45778765-dzzbw\" (UID: \"17969c13-4b9e-4a72-b81f-7160db060271\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.573203 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397c9d40-0280-4491-8aa1-2f97f28d0b9e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.573226 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rbfd\" (UniqueName: \"kubernetes.io/projected/743cf3d7-7296-4b87-94e3-67fc66dacca7-kube-api-access-6rbfd\") pod \"cluster-samples-operator-665b6dd947-p5xvf\" (UID: \"743cf3d7-7296-4b87-94e3-67fc66dacca7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p5xvf" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.573281 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss6r8\" (UniqueName: \"kubernetes.io/projected/52ab88e6-8cc7-4093-a550-570320f5e62b-kube-api-access-ss6r8\") pod \"dns-default-nzkzs\" (UID: \"52ab88e6-8cc7-4093-a550-570320f5e62b\") " pod="openshift-dns/dns-default-nzkzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.573323 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/743cf3d7-7296-4b87-94e3-67fc66dacca7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-p5xvf\" (UID: \"743cf3d7-7296-4b87-94e3-67fc66dacca7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p5xvf" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.573370 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f9ac7305-8c6d-4fb3-900f-7c64bf930d38-auth-proxy-config\") pod \"machine-approver-56656f9798-vcl6w\" (UID: \"f9ac7305-8c6d-4fb3-900f-7c64bf930d38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vcl6w" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.573520 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d57b4c5c-1f9c-4317-8912-dfc20269e1af-serving-cert\") pod \"authentication-operator-69f744f599-j5n5x\" (UID: \"d57b4c5c-1f9c-4317-8912-dfc20269e1af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j5n5x" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.573568 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/397c9d40-0280-4491-8aa1-2f97f28d0b9e-etcd-client\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " 
pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.573607 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1cf78e11-8113-4dd4-9b76-182452887bf3-audit-dir\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.573611 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56952069-b3c4-4136-94c0-fe90a0e21fcd-config\") pod \"kube-controller-manager-operator-78b949d7b-vzqh8\" (UID: \"56952069-b3c4-4136-94c0-fe90a0e21fcd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzqh8" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.573680 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5707c8e2-cffd-4f02-ab17-5eba261c57da-node-bootstrap-token\") pod \"machine-config-server-47sdz\" (UID: \"5707c8e2-cffd-4f02-ab17-5eba261c57da\") " pod="openshift-machine-config-operator/machine-config-server-47sdz" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.573688 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.573716 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/86121519-d315-4eb4-9cd0-ab26bcd23b42-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jc57b\" (UID: \"86121519-d315-4eb4-9cd0-ab26bcd23b42\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jc57b" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.573747 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ac7305-8c6d-4fb3-900f-7c64bf930d38-config\") pod \"machine-approver-56656f9798-vcl6w\" (UID: \"f9ac7305-8c6d-4fb3-900f-7c64bf930d38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vcl6w" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.573797 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whghj\" (UniqueName: \"kubernetes.io/projected/520338ff-81f2-44c4-9c0f-0d73bd42984c-kube-api-access-whghj\") pod \"openshift-controller-manager-operator-756b6f6bc6-pwq7l\" (UID: \"520338ff-81f2-44c4-9c0f-0d73bd42984c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pwq7l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.573859 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c02a7188-7359-43e4-aed9-dd67d2b42875-config\") pod \"machine-api-operator-5694c8668f-7rb5v\" (UID: \"c02a7188-7359-43e4-aed9-dd67d2b42875\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7rb5v" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.573942 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 
26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.574012 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.574048 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/397c9d40-0280-4491-8aa1-2f97f28d0b9e-audit\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.574094 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7csd\" (UniqueName: \"kubernetes.io/projected/97073e75-5bda-4dc4-8d80-bf408068aaef-kube-api-access-s7csd\") pod \"marketplace-operator-79b997595-rwq8l\" (UID: \"97073e75-5bda-4dc4-8d80-bf408068aaef\") " pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.574130 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54wwj\" (UniqueName: \"kubernetes.io/projected/4f56002e-5cd7-4eb0-9228-88e40e9a9942-kube-api-access-54wwj\") pod \"console-operator-58897d9998-qdkcv\" (UID: \"4f56002e-5cd7-4eb0-9228-88e40e9a9942\") " pod="openshift-console-operator/console-operator-58897d9998-qdkcv" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.574187 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cfe3f7c6-e363-4dd7-a52a-e5d43134aba8-images\") pod 
\"machine-config-operator-74547568cd-htt4l\" (UID: \"cfe3f7c6-e363-4dd7-a52a-e5d43134aba8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.574220 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97073e75-5bda-4dc4-8d80-bf408068aaef-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rwq8l\" (UID: \"97073e75-5bda-4dc4-8d80-bf408068aaef\") " pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.574259 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwsxn\" (UniqueName: \"kubernetes.io/projected/4848d003-6c90-4cc0-ae0e-1563303c80db-kube-api-access-bwsxn\") pod \"catalog-operator-68c6474976-k2f7k\" (UID: \"4848d003-6c90-4cc0-ae0e-1563303c80db\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.574295 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52ab88e6-8cc7-4093-a550-570320f5e62b-metrics-tls\") pod \"dns-default-nzkzs\" (UID: \"52ab88e6-8cc7-4093-a550-570320f5e62b\") " pod="openshift-dns/dns-default-nzkzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.574351 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c61407a-51aa-45ed-b80c-0376f72f01a8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-65x7g\" (UID: \"1c61407a-51aa-45ed-b80c-0376f72f01a8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65x7g" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.574411 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f56002e-5cd7-4eb0-9228-88e40e9a9942-config\") pod \"console-operator-58897d9998-qdkcv\" (UID: \"4f56002e-5cd7-4eb0-9228-88e40e9a9942\") " pod="openshift-console-operator/console-operator-58897d9998-qdkcv" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.574423 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/520338ff-81f2-44c4-9c0f-0d73bd42984c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pwq7l\" (UID: \"520338ff-81f2-44c4-9c0f-0d73bd42984c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pwq7l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.574455 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f56002e-5cd7-4eb0-9228-88e40e9a9942-serving-cert\") pod \"console-operator-58897d9998-qdkcv\" (UID: \"4f56002e-5cd7-4eb0-9228-88e40e9a9942\") " pod="openshift-console-operator/console-operator-58897d9998-qdkcv" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.574498 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/397c9d40-0280-4491-8aa1-2f97f28d0b9e-image-import-ca\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.574540 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsds5\" (UniqueName: \"kubernetes.io/projected/41d44cf9-3859-402a-9c77-d5842d7a70a3-kube-api-access-zsds5\") pod \"collect-profiles-29403270-9mdcq\" (UID: \"41d44cf9-3859-402a-9c77-d5842d7a70a3\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.574625 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c02a7188-7359-43e4-aed9-dd67d2b42875-images\") pod \"machine-api-operator-5694c8668f-7rb5v\" (UID: \"c02a7188-7359-43e4-aed9-dd67d2b42875\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7rb5v" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.574995 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e92e5871-b651-4219-ab37-6c973ea7fc92-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: E1126 22:41:08.575027 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:09.075011054 +0000 UTC m=+144.487705066 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.577276 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/397c9d40-0280-4491-8aa1-2f97f28d0b9e-audit-dir\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.578289 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.578587 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3e07175e-42fa-4065-adb1-3469e75ea4d8-service-ca\") pod \"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.578788 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f56002e-5cd7-4eb0-9228-88e40e9a9942-trusted-ca\") pod \"console-operator-58897d9998-qdkcv\" (UID: \"4f56002e-5cd7-4eb0-9228-88e40e9a9942\") " 
pod="openshift-console-operator/console-operator-58897d9998-qdkcv" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.580203 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/17969c13-4b9e-4a72-b81f-7160db060271-etcd-service-ca\") pod \"etcd-operator-b45778765-dzzbw\" (UID: \"17969c13-4b9e-4a72-b81f-7160db060271\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.580400 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/397c9d40-0280-4491-8aa1-2f97f28d0b9e-etcd-client\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.581199 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e92e5871-b651-4219-ab37-6c973ea7fc92-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.581290 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.581339 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/397c9d40-0280-4491-8aa1-2f97f28d0b9e-node-pullsecrets\") pod 
\"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.581371 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d7abe47-9752-4760-bbbf-5dca233eeb30-bound-sa-token\") pod \"ingress-operator-5b745b69d9-nt29c\" (UID: \"7d7abe47-9752-4760-bbbf-5dca233eeb30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.581401 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c61407a-51aa-45ed-b80c-0376f72f01a8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-65x7g\" (UID: \"1c61407a-51aa-45ed-b80c-0376f72f01a8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65x7g" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.581411 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4848d003-6c90-4cc0-ae0e-1563303c80db-profile-collector-cert\") pod \"catalog-operator-68c6474976-k2f7k\" (UID: \"4848d003-6c90-4cc0-ae0e-1563303c80db\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.581418 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/607a3fbf-3586-4c55-894a-f9107fc5679d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xsdh6\" (UID: \"607a3fbf-3586-4c55-894a-f9107fc5679d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.581453 5008 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/90717c11-47b1-4265-8ea3-9c826850e812-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.581511 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e07175e-42fa-4065-adb1-3469e75ea4d8-console-serving-cert\") pod \"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.581533 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/607a3fbf-3586-4c55-894a-f9107fc5679d-config\") pod \"controller-manager-879f6c89f-xsdh6\" (UID: \"607a3fbf-3586-4c55-894a-f9107fc5679d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.581556 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e547d918-c76b-4a9a-8717-ee227b89818d-socket-dir\") pod \"csi-hostpathplugin-lbcxn\" (UID: \"e547d918-c76b-4a9a-8717-ee227b89818d\") " pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.581634 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e547d918-c76b-4a9a-8717-ee227b89818d-mountpoint-dir\") pod \"csi-hostpathplugin-lbcxn\" (UID: \"e547d918-c76b-4a9a-8717-ee227b89818d\") " pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.581703 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d57b4c5c-1f9c-4317-8912-dfc20269e1af-config\") pod \"authentication-operator-69f744f599-j5n5x\" (UID: \"d57b4c5c-1f9c-4317-8912-dfc20269e1af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j5n5x" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.581750 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/17969c13-4b9e-4a72-b81f-7160db060271-etcd-ca\") pod \"etcd-operator-b45778765-dzzbw\" (UID: \"17969c13-4b9e-4a72-b81f-7160db060271\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.581809 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/397c9d40-0280-4491-8aa1-2f97f28d0b9e-image-import-ca\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.581819 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dcs8r\" (UID: \"dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.581915 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e92e5871-b651-4219-ab37-6c973ea7fc92-audit-dir\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc 
kubenswrapper[5008]: I1126 22:41:08.582043 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/397c9d40-0280-4491-8aa1-2f97f28d0b9e-serving-cert\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.582514 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/397c9d40-0280-4491-8aa1-2f97f28d0b9e-audit\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.583721 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c02a7188-7359-43e4-aed9-dd67d2b42875-config\") pod \"machine-api-operator-5694c8668f-7rb5v\" (UID: \"c02a7188-7359-43e4-aed9-dd67d2b42875\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7rb5v" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.584082 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.584338 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" 
Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.584786 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d57b4c5c-1f9c-4317-8912-dfc20269e1af-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-j5n5x\" (UID: \"d57b4c5c-1f9c-4317-8912-dfc20269e1af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j5n5x" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.573566 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d57b4c5c-1f9c-4317-8912-dfc20269e1af-service-ca-bundle\") pod \"authentication-operator-69f744f599-j5n5x\" (UID: \"d57b4c5c-1f9c-4317-8912-dfc20269e1af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j5n5x" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.584925 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3e07175e-42fa-4065-adb1-3469e75ea4d8-console-oauth-config\") pod \"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.585499 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-client-ca\") pod \"route-controller-manager-6576b87f9c-9qz6q\" (UID: \"23eb76e7-84b0-4e28-8efb-a0454bd41d1e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.585518 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/397c9d40-0280-4491-8aa1-2f97f28d0b9e-node-pullsecrets\") pod \"apiserver-76f77b778f-w5vzs\" (UID: 
\"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.586370 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/17969c13-4b9e-4a72-b81f-7160db060271-etcd-ca\") pod \"etcd-operator-b45778765-dzzbw\" (UID: \"17969c13-4b9e-4a72-b81f-7160db060271\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.586424 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e92e5871-b651-4219-ab37-6c973ea7fc92-audit-dir\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.587029 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/397c9d40-0280-4491-8aa1-2f97f28d0b9e-config\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.587579 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f9ac7305-8c6d-4fb3-900f-7c64bf930d38-auth-proxy-config\") pod \"machine-approver-56656f9798-vcl6w\" (UID: \"f9ac7305-8c6d-4fb3-900f-7c64bf930d38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vcl6w" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.588085 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-audit-policies\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: 
\"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.588195 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3e07175e-42fa-4065-adb1-3469e75ea4d8-oauth-serving-cert\") pod \"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.588359 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.588449 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f56002e-5cd7-4eb0-9228-88e40e9a9942-config\") pod \"console-operator-58897d9998-qdkcv\" (UID: \"4f56002e-5cd7-4eb0-9228-88e40e9a9942\") " pod="openshift-console-operator/console-operator-58897d9998-qdkcv" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.589386 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-audit-policies\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.589385 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d57b4c5c-1f9c-4317-8912-dfc20269e1af-config\") pod 
\"authentication-operator-69f744f599-j5n5x\" (UID: \"d57b4c5c-1f9c-4317-8912-dfc20269e1af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j5n5x" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.590377 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f56002e-5cd7-4eb0-9228-88e40e9a9942-serving-cert\") pod \"console-operator-58897d9998-qdkcv\" (UID: \"4f56002e-5cd7-4eb0-9228-88e40e9a9942\") " pod="openshift-console-operator/console-operator-58897d9998-qdkcv" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.590549 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/743cf3d7-7296-4b87-94e3-67fc66dacca7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-p5xvf\" (UID: \"743cf3d7-7296-4b87-94e3-67fc66dacca7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p5xvf" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.590792 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-serving-cert\") pod \"route-controller-manager-6576b87f9c-9qz6q\" (UID: \"23eb76e7-84b0-4e28-8efb-a0454bd41d1e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.591721 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.592213 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/90717c11-47b1-4265-8ea3-9c826850e812-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.593249 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/607a3fbf-3586-4c55-894a-f9107fc5679d-serving-cert\") pod \"controller-manager-879f6c89f-xsdh6\" (UID: \"607a3fbf-3586-4c55-894a-f9107fc5679d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.593449 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c02a7188-7359-43e4-aed9-dd67d2b42875-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7rb5v\" (UID: \"c02a7188-7359-43e4-aed9-dd67d2b42875\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7rb5v" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.592435 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/967a2b95-097f-4abc-b5eb-e6642b7f2e97-config\") pod \"service-ca-operator-777779d784-6x274\" (UID: \"967a2b95-097f-4abc-b5eb-e6642b7f2e97\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6x274" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.595628 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfz74\" (UniqueName: \"kubernetes.io/projected/218622cd-c88c-44d8-b1f9-5090c8957212-kube-api-access-zfz74\") pod \"kube-storage-version-migrator-operator-b67b599dd-bv6bq\" (UID: \"218622cd-c88c-44d8-b1f9-5090c8957212\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bv6bq" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.595842 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.596110 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/397c9d40-0280-4491-8aa1-2f97f28d0b9e-serving-cert\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.596324 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17969c13-4b9e-4a72-b81f-7160db060271-serving-cert\") pod \"etcd-operator-b45778765-dzzbw\" (UID: \"17969c13-4b9e-4a72-b81f-7160db060271\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.596646 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c97b71a3-678a-485a-a890-508df1e45bdf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9qsl7\" (UID: \"c97b71a3-678a-485a-a890-508df1e45bdf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9qsl7" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.596885 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78fhd\" 
(UniqueName: \"kubernetes.io/projected/c50d9878-2526-49de-8461-8b077b1c688e-kube-api-access-78fhd\") pod \"packageserver-d55dfcdfc-l7qvj\" (UID: \"c50d9878-2526-49de-8461-8b077b1c688e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.596759 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.595993 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e07175e-42fa-4065-adb1-3469e75ea4d8-console-serving-cert\") pod \"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.596715 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.597208 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f9ac7305-8c6d-4fb3-900f-7c64bf930d38-machine-approver-tls\") pod \"machine-approver-56656f9798-vcl6w\" (UID: \"f9ac7305-8c6d-4fb3-900f-7c64bf930d38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vcl6w" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 
22:41:08.597807 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.597473 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/397c9d40-0280-4491-8aa1-2f97f28d0b9e-encryption-config\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.598119 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d57b4c5c-1f9c-4317-8912-dfc20269e1af-serving-cert\") pod \"authentication-operator-69f744f599-j5n5x\" (UID: \"d57b4c5c-1f9c-4317-8912-dfc20269e1af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j5n5x" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.599254 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/218622cd-c88c-44d8-b1f9-5090c8957212-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bv6bq\" (UID: \"218622cd-c88c-44d8-b1f9-5090c8957212\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bv6bq" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.599382 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb8tk\" (UniqueName: \"kubernetes.io/projected/e547d918-c76b-4a9a-8717-ee227b89818d-kube-api-access-vb8tk\") pod 
\"csi-hostpathplugin-lbcxn\" (UID: \"e547d918-c76b-4a9a-8717-ee227b89818d\") " pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.599461 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c50d9878-2526-49de-8461-8b077b1c688e-webhook-cert\") pod \"packageserver-d55dfcdfc-l7qvj\" (UID: \"c50d9878-2526-49de-8461-8b077b1c688e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.599532 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqf5x\" (UniqueName: \"kubernetes.io/projected/d57b4c5c-1f9c-4317-8912-dfc20269e1af-kube-api-access-xqf5x\") pod \"authentication-operator-69f744f599-j5n5x\" (UID: \"d57b4c5c-1f9c-4317-8912-dfc20269e1af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j5n5x" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.599586 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8023be5-ae5d-4ca1-8541-6ec8d838e1df-metrics-certs\") pod \"router-default-5444994796-qhtsg\" (UID: \"f8023be5-ae5d-4ca1-8541-6ec8d838e1df\") " pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.599626 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8023be5-ae5d-4ca1-8541-6ec8d838e1df-service-ca-bundle\") pod \"router-default-5444994796-qhtsg\" (UID: \"f8023be5-ae5d-4ca1-8541-6ec8d838e1df\") " pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.599661 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/607a3fbf-3586-4c55-894a-f9107fc5679d-client-ca\") pod \"controller-manager-879f6c89f-xsdh6\" (UID: \"607a3fbf-3586-4c55-894a-f9107fc5679d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.599694 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm2c2\" (UniqueName: \"kubernetes.io/projected/f8023be5-ae5d-4ca1-8541-6ec8d838e1df-kube-api-access-qm2c2\") pod \"router-default-5444994796-qhtsg\" (UID: \"f8023be5-ae5d-4ca1-8541-6ec8d838e1df\") " pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.599728 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1c61407a-51aa-45ed-b80c-0376f72f01a8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-65x7g\" (UID: \"1c61407a-51aa-45ed-b80c-0376f72f01a8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65x7g" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.599955 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3e07175e-42fa-4065-adb1-3469e75ea4d8-console-config\") pod \"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.600072 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/520338ff-81f2-44c4-9c0f-0d73bd42984c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pwq7l\" (UID: \"520338ff-81f2-44c4-9c0f-0d73bd42984c\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pwq7l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.600124 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56952069-b3c4-4136-94c0-fe90a0e21fcd-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vzqh8\" (UID: \"56952069-b3c4-4136-94c0-fe90a0e21fcd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzqh8" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.600176 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e92e5871-b651-4219-ab37-6c973ea7fc92-encryption-config\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.600296 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.600344 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c97b71a3-678a-485a-a890-508df1e45bdf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9qsl7\" (UID: \"c97b71a3-678a-485a-a890-508df1e45bdf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9qsl7" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.600397 5008 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/90717c11-47b1-4265-8ea3-9c826850e812-registry-certificates\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.600436 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfrqw\" (UniqueName: \"kubernetes.io/projected/90717c11-47b1-4265-8ea3-9c826850e812-kube-api-access-cfrqw\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.600470 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e92e5871-b651-4219-ab37-6c973ea7fc92-etcd-client\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.600518 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f8023be5-ae5d-4ca1-8541-6ec8d838e1df-default-certificate\") pod \"router-default-5444994796-qhtsg\" (UID: \"f8023be5-ae5d-4ca1-8541-6ec8d838e1df\") " pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.600554 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grls8\" (UniqueName: \"kubernetes.io/projected/1c61407a-51aa-45ed-b80c-0376f72f01a8-kube-api-access-grls8\") pod \"cluster-image-registry-operator-dc59b4c8b-65x7g\" (UID: \"1c61407a-51aa-45ed-b80c-0376f72f01a8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65x7g" Nov 26 
22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.600587 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2e26305-8405-49e7-a5f6-611f769851d6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rz4wr\" (UID: \"e2e26305-8405-49e7-a5f6-611f769851d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rz4wr" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.600612 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4dc3d8f0-7538-4163-a3f9-b7e7daa055d2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-47bkz\" (UID: \"4dc3d8f0-7538-4163-a3f9-b7e7daa055d2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-47bkz" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.600651 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwk5g\" (UniqueName: \"kubernetes.io/projected/0337812a-7694-45b8-92de-1740afe87f27-kube-api-access-kwk5g\") pod \"multus-admission-controller-857f4d67dd-4d4wh\" (UID: \"0337812a-7694-45b8-92de-1740afe87f27\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4d4wh" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.600683 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/17969c13-4b9e-4a72-b81f-7160db060271-etcd-client\") pod \"etcd-operator-b45778765-dzzbw\" (UID: \"17969c13-4b9e-4a72-b81f-7160db060271\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.600712 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/0337812a-7694-45b8-92de-1740afe87f27-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4d4wh\" (UID: \"0337812a-7694-45b8-92de-1740afe87f27\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4d4wh" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.600746 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f8023be5-ae5d-4ca1-8541-6ec8d838e1df-stats-auth\") pod \"router-default-5444994796-qhtsg\" (UID: \"f8023be5-ae5d-4ca1-8541-6ec8d838e1df\") " pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.600772 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e92e5871-b651-4219-ab37-6c973ea7fc92-audit-policies\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.600808 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4848d003-6c90-4cc0-ae0e-1563303c80db-srv-cert\") pod \"catalog-operator-68c6474976-k2f7k\" (UID: \"4848d003-6c90-4cc0-ae0e-1563303c80db\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.601586 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2e26305-8405-49e7-a5f6-611f769851d6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rz4wr\" (UID: \"e2e26305-8405-49e7-a5f6-611f769851d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rz4wr" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.603629 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/607a3fbf-3586-4c55-894a-f9107fc5679d-client-ca\") pod \"controller-manager-879f6c89f-xsdh6\" (UID: \"607a3fbf-3586-4c55-894a-f9107fc5679d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.604473 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e92e5871-b651-4219-ab37-6c973ea7fc92-audit-policies\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.604643 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f9ac7305-8c6d-4fb3-900f-7c64bf930d38-machine-approver-tls\") pod \"machine-approver-56656f9798-vcl6w\" (UID: \"f9ac7305-8c6d-4fb3-900f-7c64bf930d38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vcl6w" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.606208 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3e07175e-42fa-4065-adb1-3469e75ea4d8-console-config\") pod \"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.607264 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e92e5871-b651-4219-ab37-6c973ea7fc92-encryption-config\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.607653 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/90717c11-47b1-4265-8ea3-9c826850e812-registry-certificates\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.607727 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/041c5cf5-31c5-436f-b5c2-847ee3f3708c-metrics-tls\") pod \"dns-operator-744455d44c-m4q9d\" (UID: \"041c5cf5-31c5-436f-b5c2-847ee3f3708c\") " pod="openshift-dns-operator/dns-operator-744455d44c-m4q9d" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.607896 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/17969c13-4b9e-4a72-b81f-7160db060271-etcd-client\") pod \"etcd-operator-b45778765-dzzbw\" (UID: \"17969c13-4b9e-4a72-b81f-7160db060271\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.608237 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1c61407a-51aa-45ed-b80c-0376f72f01a8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-65x7g\" (UID: \"1c61407a-51aa-45ed-b80c-0376f72f01a8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65x7g" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.608579 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.608590 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e92e5871-b651-4219-ab37-6c973ea7fc92-etcd-client\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.608749 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17969c13-4b9e-4a72-b81f-7160db060271-serving-cert\") pod \"etcd-operator-b45778765-dzzbw\" (UID: \"17969c13-4b9e-4a72-b81f-7160db060271\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.608932 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2e26305-8405-49e7-a5f6-611f769851d6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rz4wr\" (UID: \"e2e26305-8405-49e7-a5f6-611f769851d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rz4wr" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.609693 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/520338ff-81f2-44c4-9c0f-0d73bd42984c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pwq7l\" (UID: \"520338ff-81f2-44c4-9c0f-0d73bd42984c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pwq7l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.611366 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj4md\" (UniqueName: \"kubernetes.io/projected/e2e26305-8405-49e7-a5f6-611f769851d6-kube-api-access-gj4md\") pod 
\"openshift-apiserver-operator-796bbdcf4f-rz4wr\" (UID: \"e2e26305-8405-49e7-a5f6-611f769851d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rz4wr" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.611470 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.631068 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c61407a-51aa-45ed-b80c-0376f72f01a8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-65x7g\" (UID: \"1c61407a-51aa-45ed-b80c-0376f72f01a8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65x7g" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.649266 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p22sq\" (UniqueName: \"kubernetes.io/projected/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-kube-api-access-p22sq\") pod \"route-controller-manager-6576b87f9c-9qz6q\" (UID: \"23eb76e7-84b0-4e28-8efb-a0454bd41d1e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.664930 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8z8d\" (UniqueName: \"kubernetes.io/projected/1cf78e11-8113-4dd4-9b76-182452887bf3-kube-api-access-c8z8d\") pod \"oauth-openshift-558db77b4-k77kt\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") " pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.686522 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-b22st\" (UniqueName: \"kubernetes.io/projected/041c5cf5-31c5-436f-b5c2-847ee3f3708c-kube-api-access-b22st\") pod \"dns-operator-744455d44c-m4q9d\" (UID: \"041c5cf5-31c5-436f-b5c2-847ee3f3708c\") " pod="openshift-dns-operator/dns-operator-744455d44c-m4q9d"
Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.705341 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 26 22:41:08 crc kubenswrapper[5008]: E1126 22:41:08.705559 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:09.205504144 +0000 UTC m=+144.618198146 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.705637 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41d44cf9-3859-402a-9c77-d5842d7a70a3-secret-volume\") pod \"collect-profiles-29403270-9mdcq\" (UID: \"41d44cf9-3859-402a-9c77-d5842d7a70a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq"
Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.705674 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c50d9878-2526-49de-8461-8b077b1c688e-apiservice-cert\") pod \"packageserver-d55dfcdfc-l7qvj\" (UID: \"c50d9878-2526-49de-8461-8b077b1c688e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj"
Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.705694 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrx9f\" (UniqueName: \"kubernetes.io/projected/6f72cb1c-2994-4ed4-8333-b7498c1615bf-kube-api-access-wrx9f\") pod \"downloads-7954f5f757-t29mr\" (UID: \"6f72cb1c-2994-4ed4-8333-b7498c1615bf\") " pod="openshift-console/downloads-7954f5f757-t29mr"
Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.705746 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bbadf2d2-48a3-499e-8dac-2ff8520cf311-signing-cabundle\") pod \"service-ca-9c57cc56f-drtxl\" (UID: 
\"bbadf2d2-48a3-499e-8dac-2ff8520cf311\") " pod="openshift-service-ca/service-ca-9c57cc56f-drtxl" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.705788 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89f9d155-58da-4221-ac74-ef11aa42fc6f-cert\") pod \"ingress-canary-w58c8\" (UID: \"89f9d155-58da-4221-ac74-ef11aa42fc6f\") " pod="openshift-ingress-canary/ingress-canary-w58c8" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.705815 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlg9h\" (UniqueName: \"kubernetes.io/projected/bbadf2d2-48a3-499e-8dac-2ff8520cf311-kube-api-access-jlg9h\") pod \"service-ca-9c57cc56f-drtxl\" (UID: \"bbadf2d2-48a3-499e-8dac-2ff8520cf311\") " pod="openshift-service-ca/service-ca-9c57cc56f-drtxl" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.705837 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76khw\" (UniqueName: \"kubernetes.io/projected/4dc3d8f0-7538-4163-a3f9-b7e7daa055d2-kube-api-access-76khw\") pod \"control-plane-machine-set-operator-78cbb6b69f-47bkz\" (UID: \"4dc3d8f0-7538-4163-a3f9-b7e7daa055d2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-47bkz" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.705879 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt88x\" (UniqueName: \"kubernetes.io/projected/dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59-kube-api-access-nt88x\") pod \"olm-operator-6b444d44fb-dcs8r\" (UID: \"dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.705904 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/41d44cf9-3859-402a-9c77-d5842d7a70a3-config-volume\") pod \"collect-profiles-29403270-9mdcq\" (UID: \"41d44cf9-3859-402a-9c77-d5842d7a70a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.705997 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706069 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5707c8e2-cffd-4f02-ab17-5eba261c57da-certs\") pod \"machine-config-server-47sdz\" (UID: \"5707c8e2-cffd-4f02-ab17-5eba261c57da\") " pod="openshift-machine-config-operator/machine-config-server-47sdz" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706099 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ql8q\" (UniqueName: \"kubernetes.io/projected/cfe3f7c6-e363-4dd7-a52a-e5d43134aba8-kube-api-access-2ql8q\") pod \"machine-config-operator-74547568cd-htt4l\" (UID: \"cfe3f7c6-e363-4dd7-a52a-e5d43134aba8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706143 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrf7r\" (UniqueName: \"kubernetes.io/projected/89f9d155-58da-4221-ac74-ef11aa42fc6f-kube-api-access-lrf7r\") pod \"ingress-canary-w58c8\" (UID: \"89f9d155-58da-4221-ac74-ef11aa42fc6f\") " pod="openshift-ingress-canary/ingress-canary-w58c8" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 
22:41:08.706159 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52ab88e6-8cc7-4093-a550-570320f5e62b-config-volume\") pod \"dns-default-nzkzs\" (UID: \"52ab88e6-8cc7-4093-a550-570320f5e62b\") " pod="openshift-dns/dns-default-nzkzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706180 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/218622cd-c88c-44d8-b1f9-5090c8957212-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bv6bq\" (UID: \"218622cd-c88c-44d8-b1f9-5090c8957212\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bv6bq" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706197 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e547d918-c76b-4a9a-8717-ee227b89818d-plugins-dir\") pod \"csi-hostpathplugin-lbcxn\" (UID: \"e547d918-c76b-4a9a-8717-ee227b89818d\") " pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706214 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/75f43961-be76-4bce-9d0f-9841bdd21c06-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-t7hhl\" (UID: \"75f43961-be76-4bce-9d0f-9841bdd21c06\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7hhl" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706238 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cfe3f7c6-e363-4dd7-a52a-e5d43134aba8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-htt4l\" (UID: 
\"cfe3f7c6-e363-4dd7-a52a-e5d43134aba8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706254 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86121519-d315-4eb4-9cd0-ab26bcd23b42-proxy-tls\") pod \"machine-config-controller-84d6567774-jc57b\" (UID: \"86121519-d315-4eb4-9cd0-ab26bcd23b42\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jc57b" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706271 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d7abe47-9752-4760-bbbf-5dca233eeb30-trusted-ca\") pod \"ingress-operator-5b745b69d9-nt29c\" (UID: \"7d7abe47-9752-4760-bbbf-5dca233eeb30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706302 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfe3f7c6-e363-4dd7-a52a-e5d43134aba8-proxy-tls\") pod \"machine-config-operator-74547568cd-htt4l\" (UID: \"cfe3f7c6-e363-4dd7-a52a-e5d43134aba8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706327 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59-srv-cert\") pod \"olm-operator-6b444d44fb-dcs8r\" (UID: \"dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706364 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss6r8\" (UniqueName: 
\"kubernetes.io/projected/52ab88e6-8cc7-4093-a550-570320f5e62b-kube-api-access-ss6r8\") pod \"dns-default-nzkzs\" (UID: \"52ab88e6-8cc7-4093-a550-570320f5e62b\") " pod="openshift-dns/dns-default-nzkzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706384 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56952069-b3c4-4136-94c0-fe90a0e21fcd-config\") pod \"kube-controller-manager-operator-78b949d7b-vzqh8\" (UID: \"56952069-b3c4-4136-94c0-fe90a0e21fcd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzqh8" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706399 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5707c8e2-cffd-4f02-ab17-5eba261c57da-node-bootstrap-token\") pod \"machine-config-server-47sdz\" (UID: \"5707c8e2-cffd-4f02-ab17-5eba261c57da\") " pod="openshift-machine-config-operator/machine-config-server-47sdz" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706422 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/86121519-d315-4eb4-9cd0-ab26bcd23b42-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jc57b\" (UID: \"86121519-d315-4eb4-9cd0-ab26bcd23b42\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jc57b" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706447 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7csd\" (UniqueName: \"kubernetes.io/projected/97073e75-5bda-4dc4-8d80-bf408068aaef-kube-api-access-s7csd\") pod \"marketplace-operator-79b997595-rwq8l\" (UID: \"97073e75-5bda-4dc4-8d80-bf408068aaef\") " pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" Nov 26 22:41:08 crc kubenswrapper[5008]: 
I1126 22:41:08.706475 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cfe3f7c6-e363-4dd7-a52a-e5d43134aba8-images\") pod \"machine-config-operator-74547568cd-htt4l\" (UID: \"cfe3f7c6-e363-4dd7-a52a-e5d43134aba8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706500 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97073e75-5bda-4dc4-8d80-bf408068aaef-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rwq8l\" (UID: \"97073e75-5bda-4dc4-8d80-bf408068aaef\") " pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706517 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwsxn\" (UniqueName: \"kubernetes.io/projected/4848d003-6c90-4cc0-ae0e-1563303c80db-kube-api-access-bwsxn\") pod \"catalog-operator-68c6474976-k2f7k\" (UID: \"4848d003-6c90-4cc0-ae0e-1563303c80db\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706532 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52ab88e6-8cc7-4093-a550-570320f5e62b-metrics-tls\") pod \"dns-default-nzkzs\" (UID: \"52ab88e6-8cc7-4093-a550-570320f5e62b\") " pod="openshift-dns/dns-default-nzkzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706549 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsds5\" (UniqueName: \"kubernetes.io/projected/41d44cf9-3859-402a-9c77-d5842d7a70a3-kube-api-access-zsds5\") pod \"collect-profiles-29403270-9mdcq\" (UID: \"41d44cf9-3859-402a-9c77-d5842d7a70a3\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706568 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d7abe47-9752-4760-bbbf-5dca233eeb30-bound-sa-token\") pod \"ingress-operator-5b745b69d9-nt29c\" (UID: \"7d7abe47-9752-4760-bbbf-5dca233eeb30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706586 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4848d003-6c90-4cc0-ae0e-1563303c80db-profile-collector-cert\") pod \"catalog-operator-68c6474976-k2f7k\" (UID: \"4848d003-6c90-4cc0-ae0e-1563303c80db\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706601 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e547d918-c76b-4a9a-8717-ee227b89818d-socket-dir\") pod \"csi-hostpathplugin-lbcxn\" (UID: \"e547d918-c76b-4a9a-8717-ee227b89818d\") " pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706615 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e547d918-c76b-4a9a-8717-ee227b89818d-mountpoint-dir\") pod \"csi-hostpathplugin-lbcxn\" (UID: \"e547d918-c76b-4a9a-8717-ee227b89818d\") " pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706633 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-dcs8r\" (UID: \"dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r"
Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706654 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/967a2b95-097f-4abc-b5eb-e6642b7f2e97-config\") pod \"service-ca-operator-777779d784-6x274\" (UID: \"967a2b95-097f-4abc-b5eb-e6642b7f2e97\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6x274"
Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706670 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfz74\" (UniqueName: \"kubernetes.io/projected/218622cd-c88c-44d8-b1f9-5090c8957212-kube-api-access-zfz74\") pod \"kube-storage-version-migrator-operator-b67b599dd-bv6bq\" (UID: \"218622cd-c88c-44d8-b1f9-5090c8957212\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bv6bq"
Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706689 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c97b71a3-678a-485a-a890-508df1e45bdf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9qsl7\" (UID: \"c97b71a3-678a-485a-a890-508df1e45bdf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9qsl7"
Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706705 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78fhd\" (UniqueName: \"kubernetes.io/projected/c50d9878-2526-49de-8461-8b077b1c688e-kube-api-access-78fhd\") pod \"packageserver-d55dfcdfc-l7qvj\" (UID: \"c50d9878-2526-49de-8461-8b077b1c688e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj"
Nov 26 22:41:08 crc kubenswrapper[5008]: E1126 22:41:08.706748 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:09.206722757 +0000 UTC m=+144.619416819 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706805 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/218622cd-c88c-44d8-b1f9-5090c8957212-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bv6bq\" (UID: \"218622cd-c88c-44d8-b1f9-5090c8957212\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bv6bq"
Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706860 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb8tk\" (UniqueName: \"kubernetes.io/projected/e547d918-c76b-4a9a-8717-ee227b89818d-kube-api-access-vb8tk\") pod \"csi-hostpathplugin-lbcxn\" (UID: \"e547d918-c76b-4a9a-8717-ee227b89818d\") " pod="hostpath-provisioner/csi-hostpathplugin-lbcxn"
Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706896 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c50d9878-2526-49de-8461-8b077b1c688e-webhook-cert\") pod \"packageserver-d55dfcdfc-l7qvj\" (UID: \"c50d9878-2526-49de-8461-8b077b1c688e\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.706990 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8023be5-ae5d-4ca1-8541-6ec8d838e1df-metrics-certs\") pod \"router-default-5444994796-qhtsg\" (UID: \"f8023be5-ae5d-4ca1-8541-6ec8d838e1df\") " pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.707025 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8023be5-ae5d-4ca1-8541-6ec8d838e1df-service-ca-bundle\") pod \"router-default-5444994796-qhtsg\" (UID: \"f8023be5-ae5d-4ca1-8541-6ec8d838e1df\") " pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.707056 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm2c2\" (UniqueName: \"kubernetes.io/projected/f8023be5-ae5d-4ca1-8541-6ec8d838e1df-kube-api-access-qm2c2\") pod \"router-default-5444994796-qhtsg\" (UID: \"f8023be5-ae5d-4ca1-8541-6ec8d838e1df\") " pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.707087 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bbadf2d2-48a3-499e-8dac-2ff8520cf311-signing-cabundle\") pod \"service-ca-9c57cc56f-drtxl\" (UID: \"bbadf2d2-48a3-499e-8dac-2ff8520cf311\") " pod="openshift-service-ca/service-ca-9c57cc56f-drtxl" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.707107 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56952069-b3c4-4136-94c0-fe90a0e21fcd-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-vzqh8\" (UID: \"56952069-b3c4-4136-94c0-fe90a0e21fcd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzqh8" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.707145 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c97b71a3-678a-485a-a890-508df1e45bdf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9qsl7\" (UID: \"c97b71a3-678a-485a-a890-508df1e45bdf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9qsl7" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.707179 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41d44cf9-3859-402a-9c77-d5842d7a70a3-config-volume\") pod \"collect-profiles-29403270-9mdcq\" (UID: \"41d44cf9-3859-402a-9c77-d5842d7a70a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.708015 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/86121519-d315-4eb4-9cd0-ab26bcd23b42-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jc57b\" (UID: \"86121519-d315-4eb4-9cd0-ab26bcd23b42\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jc57b" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.708374 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c50d9878-2526-49de-8461-8b077b1c688e-apiservice-cert\") pod \"packageserver-d55dfcdfc-l7qvj\" (UID: \"c50d9878-2526-49de-8461-8b077b1c688e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.708931 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e547d918-c76b-4a9a-8717-ee227b89818d-socket-dir\") pod \"csi-hostpathplugin-lbcxn\" (UID: \"e547d918-c76b-4a9a-8717-ee227b89818d\") " pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.709378 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5707c8e2-cffd-4f02-ab17-5eba261c57da-certs\") pod \"machine-config-server-47sdz\" (UID: \"5707c8e2-cffd-4f02-ab17-5eba261c57da\") " pod="openshift-machine-config-operator/machine-config-server-47sdz" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.709565 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89f9d155-58da-4221-ac74-ef11aa42fc6f-cert\") pod \"ingress-canary-w58c8\" (UID: \"89f9d155-58da-4221-ac74-ef11aa42fc6f\") " pod="openshift-ingress-canary/ingress-canary-w58c8" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.709937 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41d44cf9-3859-402a-9c77-d5842d7a70a3-secret-volume\") pod \"collect-profiles-29403270-9mdcq\" (UID: \"41d44cf9-3859-402a-9c77-d5842d7a70a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.710131 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/218622cd-c88c-44d8-b1f9-5090c8957212-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bv6bq\" (UID: \"218622cd-c88c-44d8-b1f9-5090c8957212\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bv6bq" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.710224 5008 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e547d918-c76b-4a9a-8717-ee227b89818d-mountpoint-dir\") pod \"csi-hostpathplugin-lbcxn\" (UID: \"e547d918-c76b-4a9a-8717-ee227b89818d\") " pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.710338 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/967a2b95-097f-4abc-b5eb-e6642b7f2e97-config\") pod \"service-ca-operator-777779d784-6x274\" (UID: \"967a2b95-097f-4abc-b5eb-e6642b7f2e97\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6x274" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.710928 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d7abe47-9752-4760-bbbf-5dca233eeb30-trusted-ca\") pod \"ingress-operator-5b745b69d9-nt29c\" (UID: \"7d7abe47-9752-4760-bbbf-5dca233eeb30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.711150 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56952069-b3c4-4136-94c0-fe90a0e21fcd-config\") pod \"kube-controller-manager-operator-78b949d7b-vzqh8\" (UID: \"56952069-b3c4-4136-94c0-fe90a0e21fcd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzqh8" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.711396 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97073e75-5bda-4dc4-8d80-bf408068aaef-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rwq8l\" (UID: \"97073e75-5bda-4dc4-8d80-bf408068aaef\") " pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.711521 
5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f8023be5-ae5d-4ca1-8541-6ec8d838e1df-default-certificate\") pod \"router-default-5444994796-qhtsg\" (UID: \"f8023be5-ae5d-4ca1-8541-6ec8d838e1df\") " pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.711548 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4dc3d8f0-7538-4163-a3f9-b7e7daa055d2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-47bkz\" (UID: \"4dc3d8f0-7538-4163-a3f9-b7e7daa055d2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-47bkz" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.711571 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8023be5-ae5d-4ca1-8541-6ec8d838e1df-service-ca-bundle\") pod \"router-default-5444994796-qhtsg\" (UID: \"f8023be5-ae5d-4ca1-8541-6ec8d838e1df\") " pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.711594 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwk5g\" (UniqueName: \"kubernetes.io/projected/0337812a-7694-45b8-92de-1740afe87f27-kube-api-access-kwk5g\") pod \"multus-admission-controller-857f4d67dd-4d4wh\" (UID: \"0337812a-7694-45b8-92de-1740afe87f27\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4d4wh" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.711616 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0337812a-7694-45b8-92de-1740afe87f27-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4d4wh\" (UID: 
\"0337812a-7694-45b8-92de-1740afe87f27\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4d4wh" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.711633 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f8023be5-ae5d-4ca1-8541-6ec8d838e1df-stats-auth\") pod \"router-default-5444994796-qhtsg\" (UID: \"f8023be5-ae5d-4ca1-8541-6ec8d838e1df\") " pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.712441 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cfe3f7c6-e363-4dd7-a52a-e5d43134aba8-images\") pod \"machine-config-operator-74547568cd-htt4l\" (UID: \"cfe3f7c6-e363-4dd7-a52a-e5d43134aba8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.712491 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52ab88e6-8cc7-4093-a550-570320f5e62b-metrics-tls\") pod \"dns-default-nzkzs\" (UID: \"52ab88e6-8cc7-4093-a550-570320f5e62b\") " pod="openshift-dns/dns-default-nzkzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.712651 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59-srv-cert\") pod \"olm-operator-6b444d44fb-dcs8r\" (UID: \"dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.713165 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52ab88e6-8cc7-4093-a550-570320f5e62b-config-volume\") pod \"dns-default-nzkzs\" (UID: \"52ab88e6-8cc7-4093-a550-570320f5e62b\") " 
pod="openshift-dns/dns-default-nzkzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.713440 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dcs8r\" (UID: \"dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.713658 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5707c8e2-cffd-4f02-ab17-5eba261c57da-node-bootstrap-token\") pod \"machine-config-server-47sdz\" (UID: \"5707c8e2-cffd-4f02-ab17-5eba261c57da\") " pod="openshift-machine-config-operator/machine-config-server-47sdz" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.713778 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cfe3f7c6-e363-4dd7-a52a-e5d43134aba8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-htt4l\" (UID: \"cfe3f7c6-e363-4dd7-a52a-e5d43134aba8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.713884 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e547d918-c76b-4a9a-8717-ee227b89818d-plugins-dir\") pod \"csi-hostpathplugin-lbcxn\" (UID: \"e547d918-c76b-4a9a-8717-ee227b89818d\") " pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.713934 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4848d003-6c90-4cc0-ae0e-1563303c80db-srv-cert\") pod \"catalog-operator-68c6474976-k2f7k\" (UID: 
\"4848d003-6c90-4cc0-ae0e-1563303c80db\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.714749 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bb7aec8-881b-46d2-a40d-0758d1141f55-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fkn42\" (UID: \"3bb7aec8-881b-46d2-a40d-0758d1141f55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fkn42" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.714928 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86121519-d315-4eb4-9cd0-ab26bcd23b42-proxy-tls\") pod \"machine-config-controller-84d6567774-jc57b\" (UID: \"86121519-d315-4eb4-9cd0-ab26bcd23b42\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jc57b" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.715022 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfe3f7c6-e363-4dd7-a52a-e5d43134aba8-proxy-tls\") pod \"machine-config-operator-74547568cd-htt4l\" (UID: \"cfe3f7c6-e363-4dd7-a52a-e5d43134aba8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.715165 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq8pf\" (UniqueName: \"kubernetes.io/projected/68c3e5aa-40f1-4600-951b-1c2460ed8466-kube-api-access-rq8pf\") pod \"migrator-59844c95c7-s2w2s\" (UID: \"68c3e5aa-40f1-4600-951b-1c2460ed8466\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s2w2s" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.715237 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3bb7aec8-881b-46d2-a40d-0758d1141f55-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fkn42\" (UID: \"3bb7aec8-881b-46d2-a40d-0758d1141f55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fkn42" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.715255 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97073e75-5bda-4dc4-8d80-bf408068aaef-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rwq8l\" (UID: \"97073e75-5bda-4dc4-8d80-bf408068aaef\") " pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.715286 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e547d918-c76b-4a9a-8717-ee227b89818d-registration-dir\") pod \"csi-hostpathplugin-lbcxn\" (UID: \"e547d918-c76b-4a9a-8717-ee227b89818d\") " pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.715390 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/967a2b95-097f-4abc-b5eb-e6642b7f2e97-serving-cert\") pod \"service-ca-operator-777779d784-6x274\" (UID: \"967a2b95-097f-4abc-b5eb-e6642b7f2e97\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6x274" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.715795 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4dc3d8f0-7538-4163-a3f9-b7e7daa055d2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-47bkz\" (UID: \"4dc3d8f0-7538-4163-a3f9-b7e7daa055d2\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-47bkz" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.715944 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e547d918-c76b-4a9a-8717-ee227b89818d-csi-data-dir\") pod \"csi-hostpathplugin-lbcxn\" (UID: \"e547d918-c76b-4a9a-8717-ee227b89818d\") " pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.716040 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c50d9878-2526-49de-8461-8b077b1c688e-tmpfs\") pod \"packageserver-d55dfcdfc-l7qvj\" (UID: \"c50d9878-2526-49de-8461-8b077b1c688e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.716195 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e547d918-c76b-4a9a-8717-ee227b89818d-csi-data-dir\") pod \"csi-hostpathplugin-lbcxn\" (UID: \"e547d918-c76b-4a9a-8717-ee227b89818d\") " pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.716298 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8nwb\" (UniqueName: \"kubernetes.io/projected/5707c8e2-cffd-4f02-ab17-5eba261c57da-kube-api-access-b8nwb\") pod \"machine-config-server-47sdz\" (UID: \"5707c8e2-cffd-4f02-ab17-5eba261c57da\") " pod="openshift-machine-config-operator/machine-config-server-47sdz" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.716654 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpwvx\" (UniqueName: \"kubernetes.io/projected/967a2b95-097f-4abc-b5eb-e6642b7f2e97-kube-api-access-vpwvx\") pod \"service-ca-operator-777779d784-6x274\" (UID: 
\"967a2b95-097f-4abc-b5eb-e6642b7f2e97\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6x274" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.716812 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c50d9878-2526-49de-8461-8b077b1c688e-tmpfs\") pod \"packageserver-d55dfcdfc-l7qvj\" (UID: \"c50d9878-2526-49de-8461-8b077b1c688e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.716821 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4848d003-6c90-4cc0-ae0e-1563303c80db-profile-collector-cert\") pod \"catalog-operator-68c6474976-k2f7k\" (UID: \"4848d003-6c90-4cc0-ae0e-1563303c80db\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.716900 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltxgs\" (UniqueName: \"kubernetes.io/projected/75f43961-be76-4bce-9d0f-9841bdd21c06-kube-api-access-ltxgs\") pod \"package-server-manager-789f6589d5-t7hhl\" (UID: \"75f43961-be76-4bce-9d0f-9841bdd21c06\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7hhl" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.716935 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4848d003-6c90-4cc0-ae0e-1563303c80db-srv-cert\") pod \"catalog-operator-68c6474976-k2f7k\" (UID: \"4848d003-6c90-4cc0-ae0e-1563303c80db\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.716992 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/e547d918-c76b-4a9a-8717-ee227b89818d-registration-dir\") pod \"csi-hostpathplugin-lbcxn\" (UID: \"e547d918-c76b-4a9a-8717-ee227b89818d\") " pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.717013 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bbadf2d2-48a3-499e-8dac-2ff8520cf311-signing-key\") pod \"service-ca-9c57cc56f-drtxl\" (UID: \"bbadf2d2-48a3-499e-8dac-2ff8520cf311\") " pod="openshift-service-ca/service-ca-9c57cc56f-drtxl" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.717328 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c97b71a3-678a-485a-a890-508df1e45bdf-config\") pod \"kube-apiserver-operator-766d6c64bb-9qsl7\" (UID: \"c97b71a3-678a-485a-a890-508df1e45bdf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9qsl7" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.717361 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bb7aec8-881b-46d2-a40d-0758d1141f55-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fkn42\" (UID: \"3bb7aec8-881b-46d2-a40d-0758d1141f55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fkn42" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.717386 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h4cg\" (UniqueName: \"kubernetes.io/projected/86121519-d315-4eb4-9cd0-ab26bcd23b42-kube-api-access-7h4cg\") pod \"machine-config-controller-84d6567774-jc57b\" (UID: \"86121519-d315-4eb4-9cd0-ab26bcd23b42\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jc57b" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.717411 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56952069-b3c4-4136-94c0-fe90a0e21fcd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vzqh8\" (UID: \"56952069-b3c4-4136-94c0-fe90a0e21fcd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzqh8" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.717433 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d7abe47-9752-4760-bbbf-5dca233eeb30-metrics-tls\") pod \"ingress-operator-5b745b69d9-nt29c\" (UID: \"7d7abe47-9752-4760-bbbf-5dca233eeb30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.717454 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swb99\" (UniqueName: \"kubernetes.io/projected/7d7abe47-9752-4760-bbbf-5dca233eeb30-kube-api-access-swb99\") pod \"ingress-operator-5b745b69d9-nt29c\" (UID: \"7d7abe47-9752-4760-bbbf-5dca233eeb30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.717492 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3bb7aec8-881b-46d2-a40d-0758d1141f55-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fkn42\" (UID: \"3bb7aec8-881b-46d2-a40d-0758d1141f55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fkn42" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.717855 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c97b71a3-678a-485a-a890-508df1e45bdf-config\") pod \"kube-apiserver-operator-766d6c64bb-9qsl7\" (UID: 
\"c97b71a3-678a-485a-a890-508df1e45bdf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9qsl7" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.718473 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0337812a-7694-45b8-92de-1740afe87f27-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4d4wh\" (UID: \"0337812a-7694-45b8-92de-1740afe87f27\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4d4wh" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.718847 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/967a2b95-097f-4abc-b5eb-e6642b7f2e97-serving-cert\") pod \"service-ca-operator-777779d784-6x274\" (UID: \"967a2b95-097f-4abc-b5eb-e6642b7f2e97\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6x274" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.719141 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/75f43961-be76-4bce-9d0f-9841bdd21c06-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-t7hhl\" (UID: \"75f43961-be76-4bce-9d0f-9841bdd21c06\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7hhl" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.719675 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bbadf2d2-48a3-499e-8dac-2ff8520cf311-signing-key\") pod \"service-ca-9c57cc56f-drtxl\" (UID: \"bbadf2d2-48a3-499e-8dac-2ff8520cf311\") " pod="openshift-service-ca/service-ca-9c57cc56f-drtxl" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.719940 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/218622cd-c88c-44d8-b1f9-5090c8957212-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bv6bq\" (UID: \"218622cd-c88c-44d8-b1f9-5090c8957212\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bv6bq" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.720078 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c50d9878-2526-49de-8461-8b077b1c688e-webhook-cert\") pod \"packageserver-d55dfcdfc-l7qvj\" (UID: \"c50d9878-2526-49de-8461-8b077b1c688e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.720478 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f8023be5-ae5d-4ca1-8541-6ec8d838e1df-stats-auth\") pod \"router-default-5444994796-qhtsg\" (UID: \"f8023be5-ae5d-4ca1-8541-6ec8d838e1df\") " pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.721261 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56952069-b3c4-4136-94c0-fe90a0e21fcd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vzqh8\" (UID: \"56952069-b3c4-4136-94c0-fe90a0e21fcd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzqh8" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.721445 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97073e75-5bda-4dc4-8d80-bf408068aaef-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rwq8l\" (UID: \"97073e75-5bda-4dc4-8d80-bf408068aaef\") " pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" Nov 26 22:41:08 crc 
kubenswrapper[5008]: I1126 22:41:08.721683 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d7abe47-9752-4760-bbbf-5dca233eeb30-metrics-tls\") pod \"ingress-operator-5b745b69d9-nt29c\" (UID: \"7d7abe47-9752-4760-bbbf-5dca233eeb30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.722022 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bb7aec8-881b-46d2-a40d-0758d1141f55-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fkn42\" (UID: \"3bb7aec8-881b-46d2-a40d-0758d1141f55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fkn42" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.722337 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f8023be5-ae5d-4ca1-8541-6ec8d838e1df-default-certificate\") pod \"router-default-5444994796-qhtsg\" (UID: \"f8023be5-ae5d-4ca1-8541-6ec8d838e1df\") " pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.724142 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8023be5-ae5d-4ca1-8541-6ec8d838e1df-metrics-certs\") pod \"router-default-5444994796-qhtsg\" (UID: \"f8023be5-ae5d-4ca1-8541-6ec8d838e1df\") " pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.726204 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bl7t\" (UniqueName: \"kubernetes.io/projected/e92e5871-b651-4219-ab37-6c973ea7fc92-kube-api-access-5bl7t\") pod \"apiserver-7bbb656c7d-9kn85\" (UID: \"e92e5871-b651-4219-ab37-6c973ea7fc92\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.726290 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c97b71a3-678a-485a-a890-508df1e45bdf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9qsl7\" (UID: \"c97b71a3-678a-485a-a890-508df1e45bdf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9qsl7" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.741353 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-72g4c"] Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.746094 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxg8f\" (UniqueName: \"kubernetes.io/projected/397c9d40-0280-4491-8aa1-2f97f28d0b9e-kube-api-access-hxg8f\") pod \"apiserver-76f77b778f-w5vzs\" (UID: \"397c9d40-0280-4491-8aa1-2f97f28d0b9e\") " pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: W1126 22:41:08.747444 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d2869e8_cbd3_43a7_b374_bf97ca15d5df.slice/crio-8a8de1cef1940452508e4c3a7a31bd3bb334236182fa9587bc6b4ea85742d4a7 WatchSource:0}: Error finding container 8a8de1cef1940452508e4c3a7a31bd3bb334236182fa9587bc6b4ea85742d4a7: Status 404 returned error can't find the container with id 8a8de1cef1940452508e4c3a7a31bd3bb334236182fa9587bc6b4ea85742d4a7 Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.764260 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/90717c11-47b1-4265-8ea3-9c826850e812-bound-sa-token\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.786012 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np7b2\" (UniqueName: \"kubernetes.io/projected/607a3fbf-3586-4c55-894a-f9107fc5679d-kube-api-access-np7b2\") pod \"controller-manager-879f6c89f-xsdh6\" (UID: \"607a3fbf-3586-4c55-894a-f9107fc5679d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.804884 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.805449 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n2sb\" (UniqueName: \"kubernetes.io/projected/c02a7188-7359-43e4-aed9-dd67d2b42875-kube-api-access-9n2sb\") pod \"machine-api-operator-5694c8668f-7rb5v\" (UID: \"c02a7188-7359-43e4-aed9-dd67d2b42875\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7rb5v" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.807319 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.818103 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:08 crc kubenswrapper[5008]: E1126 22:41:08.818692 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:09.31867296 +0000 UTC m=+144.731366962 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.822145 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.827619 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgzt9\" (UniqueName: \"kubernetes.io/projected/f9ac7305-8c6d-4fb3-900f-7c64bf930d38-kube-api-access-cgzt9\") pod \"machine-approver-56656f9798-vcl6w\" (UID: \"f9ac7305-8c6d-4fb3-900f-7c64bf930d38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vcl6w" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.839196 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rz4wr" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.847244 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whghj\" (UniqueName: \"kubernetes.io/projected/520338ff-81f2-44c4-9c0f-0d73bd42984c-kube-api-access-whghj\") pod \"openshift-controller-manager-operator-756b6f6bc6-pwq7l\" (UID: \"520338ff-81f2-44c4-9c0f-0d73bd42984c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pwq7l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.865266 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pwq7l" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.865885 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rbfd\" (UniqueName: \"kubernetes.io/projected/743cf3d7-7296-4b87-94e3-67fc66dacca7-kube-api-access-6rbfd\") pod \"cluster-samples-operator-665b6dd947-p5xvf\" (UID: \"743cf3d7-7296-4b87-94e3-67fc66dacca7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p5xvf" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.878437 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p5xvf" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.883549 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.886870 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq8sd\" (UniqueName: \"kubernetes.io/projected/17969c13-4b9e-4a72-b81f-7160db060271-kube-api-access-lq8sd\") pod \"etcd-operator-b45778765-dzzbw\" (UID: \"17969c13-4b9e-4a72-b81f-7160db060271\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.910644 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsbl5\" (UniqueName: \"kubernetes.io/projected/3e07175e-42fa-4065-adb1-3469e75ea4d8-kube-api-access-jsbl5\") pod \"console-f9d7485db-65m9h\" (UID: \"3e07175e-42fa-4065-adb1-3469e75ea4d8\") " pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.910913 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-t29mr" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.919784 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: E1126 22:41:08.920112 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:09.42010021 +0000 UTC m=+144.832794212 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.924467 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.929655 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54wwj\" (UniqueName: \"kubernetes.io/projected/4f56002e-5cd7-4eb0-9228-88e40e9a9942-kube-api-access-54wwj\") pod \"console-operator-58897d9998-qdkcv\" (UID: \"4f56002e-5cd7-4eb0-9228-88e40e9a9942\") " pod="openshift-console-operator/console-operator-58897d9998-qdkcv" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.945146 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-m4q9d" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.952871 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfrqw\" (UniqueName: \"kubernetes.io/projected/90717c11-47b1-4265-8ea3-9c826850e812-kube-api-access-cfrqw\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.963538 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.970155 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grls8\" (UniqueName: \"kubernetes.io/projected/1c61407a-51aa-45ed-b80c-0376f72f01a8-kube-api-access-grls8\") pod \"cluster-image-registry-operator-dc59b4c8b-65x7g\" (UID: \"1c61407a-51aa-45ed-b80c-0376f72f01a8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65x7g" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.986696 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqf5x\" (UniqueName: \"kubernetes.io/projected/d57b4c5c-1f9c-4317-8912-dfc20269e1af-kube-api-access-xqf5x\") pod \"authentication-operator-69f744f599-j5n5x\" (UID: \"d57b4c5c-1f9c-4317-8912-dfc20269e1af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j5n5x" Nov 26 22:41:08 crc kubenswrapper[5008]: I1126 22:41:08.998279 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q"] Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.007524 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vcl6w" Nov 26 22:41:09 crc kubenswrapper[5008]: W1126 22:41:09.017145 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23eb76e7_84b0_4e28_8efb_a0454bd41d1e.slice/crio-87f7ef166a4b8a65d75fc966b1daad67b903b570d9bdf25832e2b1990ba16989 WatchSource:0}: Error finding container 87f7ef166a4b8a65d75fc966b1daad67b903b570d9bdf25832e2b1990ba16989: Status 404 returned error can't find the container with id 87f7ef166a4b8a65d75fc966b1daad67b903b570d9bdf25832e2b1990ba16989 Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.020538 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:09 crc kubenswrapper[5008]: E1126 22:41:09.020685 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:09.520665416 +0000 UTC m=+144.933359418 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.020814 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:09 crc kubenswrapper[5008]: E1126 22:41:09.021315 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:09.521303534 +0000 UTC m=+144.933997536 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.027528 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlg9h\" (UniqueName: \"kubernetes.io/projected/bbadf2d2-48a3-499e-8dac-2ff8520cf311-kube-api-access-jlg9h\") pod \"service-ca-9c57cc56f-drtxl\" (UID: \"bbadf2d2-48a3-499e-8dac-2ff8520cf311\") " pod="openshift-service-ca/service-ca-9c57cc56f-drtxl" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.044551 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-drtxl" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.060094 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt88x\" (UniqueName: \"kubernetes.io/projected/dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59-kube-api-access-nt88x\") pod \"olm-operator-6b444d44fb-dcs8r\" (UID: \"dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.067727 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78fhd\" (UniqueName: \"kubernetes.io/projected/c50d9878-2526-49de-8461-8b077b1c688e-kube-api-access-78fhd\") pod \"packageserver-d55dfcdfc-l7qvj\" (UID: \"c50d9878-2526-49de-8461-8b077b1c688e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.075085 5008 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85"] Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.075297 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7rb5v" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.085897 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c97b71a3-678a-485a-a890-508df1e45bdf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9qsl7\" (UID: \"c97b71a3-678a-485a-a890-508df1e45bdf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9qsl7" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.114081 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76khw\" (UniqueName: \"kubernetes.io/projected/4dc3d8f0-7538-4163-a3f9-b7e7daa055d2-kube-api-access-76khw\") pod \"control-plane-machine-set-operator-78cbb6b69f-47bkz\" (UID: \"4dc3d8f0-7538-4163-a3f9-b7e7daa055d2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-47bkz" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.121697 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:09 crc kubenswrapper[5008]: E1126 22:41:09.121885 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:09.621860399 +0000 UTC m=+145.034554401 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.122289 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:09 crc kubenswrapper[5008]: E1126 22:41:09.122597 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:09.622584579 +0000 UTC m=+145.035278581 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.129360 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-j5n5x" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.133652 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsds5\" (UniqueName: \"kubernetes.io/projected/41d44cf9-3859-402a-9c77-d5842d7a70a3-kube-api-access-zsds5\") pod \"collect-profiles-29403270-9mdcq\" (UID: \"41d44cf9-3859-402a-9c77-d5842d7a70a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.148819 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwsxn\" (UniqueName: \"kubernetes.io/projected/4848d003-6c90-4cc0-ae0e-1563303c80db-kube-api-access-bwsxn\") pod \"catalog-operator-68c6474976-k2f7k\" (UID: \"4848d003-6c90-4cc0-ae0e-1563303c80db\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.166505 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d7abe47-9752-4760-bbbf-5dca233eeb30-bound-sa-token\") pod \"ingress-operator-5b745b69d9-nt29c\" (UID: \"7d7abe47-9752-4760-bbbf-5dca233eeb30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.185956 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss6r8\" (UniqueName: \"kubernetes.io/projected/52ab88e6-8cc7-4093-a550-570320f5e62b-kube-api-access-ss6r8\") pod \"dns-default-nzkzs\" (UID: \"52ab88e6-8cc7-4093-a550-570320f5e62b\") " pod="openshift-dns/dns-default-nzkzs" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.191353 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.197448 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65x7g" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.204309 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qdkcv" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.207918 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb8tk\" (UniqueName: \"kubernetes.io/projected/e547d918-c76b-4a9a-8717-ee227b89818d-kube-api-access-vb8tk\") pod \"csi-hostpathplugin-lbcxn\" (UID: \"e547d918-c76b-4a9a-8717-ee227b89818d\") " pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.224434 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:09 crc kubenswrapper[5008]: E1126 22:41:09.225019 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:09.724988066 +0000 UTC m=+145.137682068 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.229391 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfz74\" (UniqueName: \"kubernetes.io/projected/218622cd-c88c-44d8-b1f9-5090c8957212-kube-api-access-zfz74\") pod \"kube-storage-version-migrator-operator-b67b599dd-bv6bq\" (UID: \"218622cd-c88c-44d8-b1f9-5090c8957212\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bv6bq" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.252886 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56952069-b3c4-4136-94c0-fe90a0e21fcd-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vzqh8\" (UID: \"56952069-b3c4-4136-94c0-fe90a0e21fcd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzqh8" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.257892 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9qsl7" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.265135 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-47bkz" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.267428 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7csd\" (UniqueName: \"kubernetes.io/projected/97073e75-5bda-4dc4-8d80-bf408068aaef-kube-api-access-s7csd\") pod \"marketplace-operator-79b997595-rwq8l\" (UID: \"97073e75-5bda-4dc4-8d80-bf408068aaef\") " pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.288104 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ql8q\" (UniqueName: \"kubernetes.io/projected/cfe3f7c6-e363-4dd7-a52a-e5d43134aba8-kube-api-access-2ql8q\") pod \"machine-config-operator-74547568cd-htt4l\" (UID: \"cfe3f7c6-e363-4dd7-a52a-e5d43134aba8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.292646 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzqh8" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.303459 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bv6bq" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.309336 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rz4wr"] Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.321145 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.324098 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrf7r\" (UniqueName: \"kubernetes.io/projected/89f9d155-58da-4221-ac74-ef11aa42fc6f-kube-api-access-lrf7r\") pod \"ingress-canary-w58c8\" (UID: \"89f9d155-58da-4221-ac74-ef11aa42fc6f\") " pod="openshift-ingress-canary/ingress-canary-w58c8" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.326311 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.327250 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:09 crc kubenswrapper[5008]: E1126 22:41:09.327529 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:09.827517427 +0000 UTC m=+145.240211419 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.332951 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm2c2\" (UniqueName: \"kubernetes.io/projected/f8023be5-ae5d-4ca1-8541-6ec8d838e1df-kube-api-access-qm2c2\") pod \"router-default-5444994796-qhtsg\" (UID: \"f8023be5-ae5d-4ca1-8541-6ec8d838e1df\") " pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.334780 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xsdh6"] Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.354759 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.355742 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwk5g\" (UniqueName: \"kubernetes.io/projected/0337812a-7694-45b8-92de-1740afe87f27-kube-api-access-kwk5g\") pod \"multus-admission-controller-857f4d67dd-4d4wh\" (UID: \"0337812a-7694-45b8-92de-1740afe87f27\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4d4wh" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.361274 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4d4wh" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.363791 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" event={"ID":"e92e5871-b651-4219-ab37-6c973ea7fc92","Type":"ContainerStarted","Data":"4eaebfde4966945a9d432bda8699ae1f317407daca13882ba74e1db59f883bd8"} Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.365072 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vcl6w" event={"ID":"f9ac7305-8c6d-4fb3-900f-7c64bf930d38","Type":"ContainerStarted","Data":"75ba9265806af9e830a6fba87f57e32b975ffb218fe1ccf3b4cb42dec3a113f0"} Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.369530 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq8pf\" (UniqueName: \"kubernetes.io/projected/68c3e5aa-40f1-4600-951b-1c2460ed8466-kube-api-access-rq8pf\") pod \"migrator-59844c95c7-s2w2s\" (UID: \"68c3e5aa-40f1-4600-951b-1c2460ed8466\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s2w2s" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.372285 5008 generic.go:334] "Generic (PLEG): container finished" podID="1d2869e8-cbd3-43a7-b374-bf97ca15d5df" containerID="f049c3f591845efe30ac8fcf85e67bc72127234c4f70a8fde9c817d1eac8e51f" exitCode=0 Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.372633 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72g4c" event={"ID":"1d2869e8-cbd3-43a7-b374-bf97ca15d5df","Type":"ContainerDied","Data":"f049c3f591845efe30ac8fcf85e67bc72127234c4f70a8fde9c817d1eac8e51f"} Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.372665 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72g4c" 
event={"ID":"1d2869e8-cbd3-43a7-b374-bf97ca15d5df","Type":"ContainerStarted","Data":"8a8de1cef1940452508e4c3a7a31bd3bb334236182fa9587bc6b4ea85742d4a7"} Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.379064 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.386592 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w58c8" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.387321 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8nwb\" (UniqueName: \"kubernetes.io/projected/5707c8e2-cffd-4f02-ab17-5eba261c57da-kube-api-access-b8nwb\") pod \"machine-config-server-47sdz\" (UID: \"5707c8e2-cffd-4f02-ab17-5eba261c57da\") " pod="openshift-machine-config-operator/machine-config-server-47sdz" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.397188 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.399194 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-w5vzs"] Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.410369 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-nzkzs" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.419689 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" event={"ID":"23eb76e7-84b0-4e28-8efb-a0454bd41d1e","Type":"ContainerStarted","Data":"8ee9dc36c53f3d5248d1da9e164bc4e27d2459ae7def474178d8e045201596cd"} Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.419737 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" event={"ID":"23eb76e7-84b0-4e28-8efb-a0454bd41d1e","Type":"ContainerStarted","Data":"87f7ef166a4b8a65d75fc966b1daad67b903b570d9bdf25832e2b1990ba16989"} Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.419902 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p5xvf"] Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.420659 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.426369 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.430736 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.430772 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltxgs\" (UniqueName: \"kubernetes.io/projected/75f43961-be76-4bce-9d0f-9841bdd21c06-kube-api-access-ltxgs\") pod \"package-server-manager-789f6589d5-t7hhl\" (UID: \"75f43961-be76-4bce-9d0f-9841bdd21c06\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7hhl" Nov 26 22:41:09 crc kubenswrapper[5008]: E1126 22:41:09.430941 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:09.930919771 +0000 UTC m=+145.343613773 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.431080 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:09 crc kubenswrapper[5008]: E1126 22:41:09.432003 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:09.931994461 +0000 UTC m=+145.344688463 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.449290 5008 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-9qz6q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.449339 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" podUID="23eb76e7-84b0-4e28-8efb-a0454bd41d1e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.450517 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpwvx\" (UniqueName: \"kubernetes.io/projected/967a2b95-097f-4abc-b5eb-e6642b7f2e97-kube-api-access-vpwvx\") pod \"service-ca-operator-777779d784-6x274\" (UID: \"967a2b95-097f-4abc-b5eb-e6642b7f2e97\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6x274" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.470421 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3bb7aec8-881b-46d2-a40d-0758d1141f55-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fkn42\" (UID: 
\"3bb7aec8-881b-46d2-a40d-0758d1141f55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fkn42" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.472212 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swb99\" (UniqueName: \"kubernetes.io/projected/7d7abe47-9752-4760-bbbf-5dca233eeb30-kube-api-access-swb99\") pod \"ingress-operator-5b745b69d9-nt29c\" (UID: \"7d7abe47-9752-4760-bbbf-5dca233eeb30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.493807 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-drtxl"] Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.501345 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pwq7l"] Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.501729 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h4cg\" (UniqueName: \"kubernetes.io/projected/86121519-d315-4eb4-9cd0-ab26bcd23b42-kube-api-access-7h4cg\") pod \"machine-config-controller-84d6567774-jc57b\" (UID: \"86121519-d315-4eb4-9cd0-ab26bcd23b42\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jc57b" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.517403 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-m4q9d"] Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.536672 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7rb5v"] Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.536787 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dzzbw"] Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.536897 
5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.538386 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-t29mr"] Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.539481 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c" Nov 26 22:41:09 crc kubenswrapper[5008]: E1126 22:41:09.541905 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:10.041880116 +0000 UTC m=+145.454574118 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.567189 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jc57b" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.569542 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k77kt"] Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.571449 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.578194 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.578348 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-j5n5x"] Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.587259 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fkn42" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.609587 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s2w2s" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.630540 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7hhl" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.637880 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-47sdz" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.642006 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:09 crc kubenswrapper[5008]: E1126 22:41:09.642307 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:10.142290287 +0000 UTC m=+145.554984289 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.670588 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6x274" Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.743476 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:09 crc kubenswrapper[5008]: E1126 22:41:09.743701 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:10.243675916 +0000 UTC m=+145.656369908 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.743859 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:09 crc kubenswrapper[5008]: E1126 22:41:09.744310 5008 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:10.244300174 +0000 UTC m=+145.656994246 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.791087 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-65m9h"] Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.814520 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65x7g"] Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.816346 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qdkcv"] Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.844548 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:09 crc kubenswrapper[5008]: E1126 22:41:09.844918 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 22:41:10.34490063 +0000 UTC m=+145.757594632 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:09 crc kubenswrapper[5008]: W1126 22:41:09.921474 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8023be5_ae5d_4ca1_8541_6ec8d838e1df.slice/crio-3af7669f0833bec9df9738ada9cb11e95e0042826a46df74e3f2a60f130b0407 WatchSource:0}: Error finding container 3af7669f0833bec9df9738ada9cb11e95e0042826a46df74e3f2a60f130b0407: Status 404 returned error can't find the container with id 3af7669f0833bec9df9738ada9cb11e95e0042826a46df74e3f2a60f130b0407 Nov 26 22:41:09 crc kubenswrapper[5008]: I1126 22:41:09.946850 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:09 crc kubenswrapper[5008]: E1126 22:41:09.947364 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:10.447348668 +0000 UTC m=+145.860042670 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:10 crc kubenswrapper[5008]: W1126 22:41:10.047475 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f56002e_5cd7_4eb0_9228_88e40e9a9942.slice/crio-9b65a643d649c9e302dba736439a78571d23c586cae902dbd9e08a8e3a38657d WatchSource:0}: Error finding container 9b65a643d649c9e302dba736439a78571d23c586cae902dbd9e08a8e3a38657d: Status 404 returned error can't find the container with id 9b65a643d649c9e302dba736439a78571d23c586cae902dbd9e08a8e3a38657d Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.047745 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:10 crc kubenswrapper[5008]: E1126 22:41:10.047847 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:10.547829331 +0000 UTC m=+145.960523333 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.048251 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:10 crc kubenswrapper[5008]: E1126 22:41:10.048702 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:10.548687145 +0000 UTC m=+145.961381147 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.149062 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:10 crc kubenswrapper[5008]: E1126 22:41:10.149661 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:10.649647732 +0000 UTC m=+146.062341734 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.208059 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj"] Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.220761 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9qsl7"] Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.222557 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-47bkz"] Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.224018 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4d4wh"] Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.250084 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:10 crc kubenswrapper[5008]: E1126 22:41:10.250359 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 22:41:10.750348191 +0000 UTC m=+146.163042193 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:10 crc kubenswrapper[5008]: W1126 22:41:10.282933 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc50d9878_2526_49de_8461_8b077b1c688e.slice/crio-28321f1d832b106676366232a4d4f87415e52ac0ea3168a9ae2e5af092cb2a65 WatchSource:0}: Error finding container 28321f1d832b106676366232a4d4f87415e52ac0ea3168a9ae2e5af092cb2a65: Status 404 returned error can't find the container with id 28321f1d832b106676366232a4d4f87415e52ac0ea3168a9ae2e5af092cb2a65 Nov 26 22:41:10 crc kubenswrapper[5008]: W1126 22:41:10.285868 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dc3d8f0_7538_4163_a3f9_b7e7daa055d2.slice/crio-68e7cfd92096f31f9dcc111137d3a45a5efdfc44731204bbc080fb29ca474f67 WatchSource:0}: Error finding container 68e7cfd92096f31f9dcc111137d3a45a5efdfc44731204bbc080fb29ca474f67: Status 404 returned error can't find the container with id 68e7cfd92096f31f9dcc111137d3a45a5efdfc44731204bbc080fb29ca474f67 Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.326893 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzqh8"] Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.350654 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:10 crc kubenswrapper[5008]: E1126 22:41:10.351280 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:10.851264536 +0000 UTC m=+146.263958528 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:10 crc kubenswrapper[5008]: W1126 22:41:10.391653 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc97b71a3_678a_485a_a890_508df1e45bdf.slice/crio-d291581b2ea7db7dc27cd70c1563fa10df2599d15beb688876ce19acaeeb5ab5 WatchSource:0}: Error finding container d291581b2ea7db7dc27cd70c1563fa10df2599d15beb688876ce19acaeeb5ab5: Status 404 returned error can't find the container with id d291581b2ea7db7dc27cd70c1563fa10df2599d15beb688876ce19acaeeb5ab5 Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.402803 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-w58c8"] Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.420672 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c"] Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.434028 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rwq8l"] Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.436254 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k"] Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.451057 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj" event={"ID":"c50d9878-2526-49de-8461-8b077b1c688e","Type":"ContainerStarted","Data":"28321f1d832b106676366232a4d4f87415e52ac0ea3168a9ae2e5af092cb2a65"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.452506 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:10 crc kubenswrapper[5008]: E1126 22:41:10.452860 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:10.952846751 +0000 UTC m=+146.365540763 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.460937 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72g4c" event={"ID":"1d2869e8-cbd3-43a7-b374-bf97ca15d5df","Type":"ContainerStarted","Data":"d163e72fa6c298500fc073d08e2c074fe102556cb8f62e5745628bf0ecbc2bda"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.461348 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r"] Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.461403 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72g4c" Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.464081 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nzkzs"] Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.467570 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq"] Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.468022 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vcl6w" event={"ID":"f9ac7305-8c6d-4fb3-900f-7c64bf930d38","Type":"ContainerStarted","Data":"7bc1f1baa06733519f5c328fbbb002f607b11fe70c570510fe7b347811966b74"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.469261 5008 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["hostpath-provisioner/csi-hostpathplugin-lbcxn"] Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.472733 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-47sdz" event={"ID":"5707c8e2-cffd-4f02-ab17-5eba261c57da","Type":"ContainerStarted","Data":"0b3665f12fb8f7e385a661a18c7789cecef1cdf94628c748b002012323bf1147"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.475087 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9qsl7" event={"ID":"c97b71a3-678a-485a-a890-508df1e45bdf","Type":"ContainerStarted","Data":"d291581b2ea7db7dc27cd70c1563fa10df2599d15beb688876ce19acaeeb5ab5"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.478161 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-t29mr" event={"ID":"6f72cb1c-2994-4ed4-8333-b7498c1615bf","Type":"ContainerStarted","Data":"d5cabb6f5b416e08c83138ff9ad0c4e402f6752eac32b72f4939e79ee3775035"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.478223 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-t29mr" event={"ID":"6f72cb1c-2994-4ed4-8333-b7498c1615bf","Type":"ContainerStarted","Data":"be6ee74aa14c17695b5f63078afda9d974a89b90b6b30033776ac58153d30d09"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.478249 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-t29mr" Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.481908 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-qhtsg" event={"ID":"f8023be5-ae5d-4ca1-8541-6ec8d838e1df","Type":"ContainerStarted","Data":"3af7669f0833bec9df9738ada9cb11e95e0042826a46df74e3f2a60f130b0407"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.484318 5008 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" event={"ID":"17969c13-4b9e-4a72-b81f-7160db060271","Type":"ContainerStarted","Data":"aabe46cac6fe64aef8bfdaddb25d1ce3a537fde1eebc63ea14922516fe8fcb7a"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.484346 5008 patch_prober.go:28] interesting pod/downloads-7954f5f757-t29mr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.484385 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t29mr" podUID="6f72cb1c-2994-4ed4-8333-b7498c1615bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.489491 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-65m9h" event={"ID":"3e07175e-42fa-4065-adb1-3469e75ea4d8","Type":"ContainerStarted","Data":"54c317029905dff4b7d249c624744a1789fdf6e9d8a0c10f057f4b0df5cc3c18"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.510726 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qdkcv" event={"ID":"4f56002e-5cd7-4eb0-9228-88e40e9a9942","Type":"ContainerStarted","Data":"9b65a643d649c9e302dba736439a78571d23c586cae902dbd9e08a8e3a38657d"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.513726 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" event={"ID":"607a3fbf-3586-4c55-894a-f9107fc5679d","Type":"ContainerStarted","Data":"d0fe43aee1d2840e2217b9bf56dda5e3e93dd216d7e3dd79d5d5f6e4cdc9aed2"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 
22:41:10.513763 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" event={"ID":"607a3fbf-3586-4c55-894a-f9107fc5679d","Type":"ContainerStarted","Data":"8f9e38af9bb0ca5c061f3d6533bad35a1b8ba2db7ba66ea3e4b60e4edb340e19"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.516035 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.521205 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" event={"ID":"1cf78e11-8113-4dd4-9b76-182452887bf3","Type":"ContainerStarted","Data":"e51ffbd72c01fdbdf6aa80194fbf271954ca3977b1042e1addd683fb79b360d7"} Nov 26 22:41:10 crc kubenswrapper[5008]: W1126 22:41:10.523031 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56952069_b3c4_4136_94c0_fe90a0e21fcd.slice/crio-52ebe8133396ef301975f01bff9213d33e9c615ba30d9868bf1a75ef3093db30 WatchSource:0}: Error finding container 52ebe8133396ef301975f01bff9213d33e9c615ba30d9868bf1a75ef3093db30: Status 404 returned error can't find the container with id 52ebe8133396ef301975f01bff9213d33e9c615ba30d9868bf1a75ef3093db30 Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.525730 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-drtxl" event={"ID":"bbadf2d2-48a3-499e-8dac-2ff8520cf311","Type":"ContainerStarted","Data":"aacb4bdd526b1257b3a3b0d86f23a27e20fe8af7a87699ea580e2d8608589e14"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.525768 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-drtxl" 
event={"ID":"bbadf2d2-48a3-499e-8dac-2ff8520cf311","Type":"ContainerStarted","Data":"9c633a9f763cd158b7ed5604dbd97a66b7c13b871772f9b7eaf0356985fa26ec"} Nov 26 22:41:10 crc kubenswrapper[5008]: W1126 22:41:10.530863 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfe0f94c_3d84_40e6_8ef0_8ac4d2d13f59.slice/crio-4dd95a0706cee66d6fd7e7e4fd70779779421445c1dccae2a817019077b155d5 WatchSource:0}: Error finding container 4dd95a0706cee66d6fd7e7e4fd70779779421445c1dccae2a817019077b155d5: Status 404 returned error can't find the container with id 4dd95a0706cee66d6fd7e7e4fd70779779421445c1dccae2a817019077b155d5 Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.531419 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65x7g" event={"ID":"1c61407a-51aa-45ed-b80c-0376f72f01a8","Type":"ContainerStarted","Data":"61fa952ee7fcc75854060a97a4bef6a8faf810da393a0999602af0f9abb1e1b9"} Nov 26 22:41:10 crc kubenswrapper[5008]: W1126 22:41:10.535155 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d7abe47_9752_4760_bbbf_5dca233eeb30.slice/crio-052fbe7dfbbbe2febae9d679173df607cfa5f104dadfc700892bafc9b2f4f015 WatchSource:0}: Error finding container 052fbe7dfbbbe2febae9d679173df607cfa5f104dadfc700892bafc9b2f4f015: Status 404 returned error can't find the container with id 052fbe7dfbbbe2febae9d679173df607cfa5f104dadfc700892bafc9b2f4f015 Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.536372 5008 generic.go:334] "Generic (PLEG): container finished" podID="397c9d40-0280-4491-8aa1-2f97f28d0b9e" containerID="abffd3fdae9e8f4af09715cf93ad0a7dc842f84c46fe9da960f3d1bf8b6608b7" exitCode=0 Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.536425 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" event={"ID":"397c9d40-0280-4491-8aa1-2f97f28d0b9e","Type":"ContainerDied","Data":"abffd3fdae9e8f4af09715cf93ad0a7dc842f84c46fe9da960f3d1bf8b6608b7"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.536451 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" event={"ID":"397c9d40-0280-4491-8aa1-2f97f28d0b9e","Type":"ContainerStarted","Data":"7905028c1fd06f2b3e3c7f17161480a0d61959d129cefbc282c720c5e8a56523"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.537229 5008 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xsdh6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.537256 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" podUID="607a3fbf-3586-4c55-894a-f9107fc5679d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.540799 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7rb5v" event={"ID":"c02a7188-7359-43e4-aed9-dd67d2b42875","Type":"ContainerStarted","Data":"10da262749e0f3f1f4a28e07c1d758afe79507f0bd31e57f30cc503a9aee15d0"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.540860 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7rb5v" event={"ID":"c02a7188-7359-43e4-aed9-dd67d2b42875","Type":"ContainerStarted","Data":"7c58046bb7464eb53d19ab3a7896b9569a4b19f9be7c6fbcf7befddf7c095c18"} Nov 26 22:41:10 crc kubenswrapper[5008]: W1126 
22:41:10.544662 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97073e75_5bda_4dc4_8d80_bf408068aaef.slice/crio-0aab8801d460828a58c4946b04438e5184f808f3343d13d742b94331354c5084 WatchSource:0}: Error finding container 0aab8801d460828a58c4946b04438e5184f808f3343d13d742b94331354c5084: Status 404 returned error can't find the container with id 0aab8801d460828a58c4946b04438e5184f808f3343d13d742b94331354c5084 Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.548665 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rz4wr" event={"ID":"e2e26305-8405-49e7-a5f6-611f769851d6","Type":"ContainerStarted","Data":"129b1c40b8d9e706f2b002f66bff9c022a12f4547ede537c2a7fa6c023f5f9f2"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.548720 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rz4wr" event={"ID":"e2e26305-8405-49e7-a5f6-611f769851d6","Type":"ContainerStarted","Data":"6bf9783b345e35230d55dcba28b9258138ff7dadfc5dfd789556ac025879c12c"} Nov 26 22:41:10 crc kubenswrapper[5008]: W1126 22:41:10.552691 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52ab88e6_8cc7_4093_a550_570320f5e62b.slice/crio-5860c5656ffae8fab0bd504ff769a16acef308237bba56bb77ae13488f5be140 WatchSource:0}: Error finding container 5860c5656ffae8fab0bd504ff769a16acef308237bba56bb77ae13488f5be140: Status 404 returned error can't find the container with id 5860c5656ffae8fab0bd504ff769a16acef308237bba56bb77ae13488f5be140 Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.554789 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4d4wh" 
event={"ID":"0337812a-7694-45b8-92de-1740afe87f27","Type":"ContainerStarted","Data":"a3218b236687569eab05b42617b4831100ea16feaac584f1507ec2b2088f3e6c"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.555843 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:10 crc kubenswrapper[5008]: E1126 22:41:10.559408 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:11.059377103 +0000 UTC m=+146.472071105 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.560690 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jc57b"] Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.561398 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-47bkz" event={"ID":"4dc3d8f0-7538-4163-a3f9-b7e7daa055d2","Type":"ContainerStarted","Data":"68e7cfd92096f31f9dcc111137d3a45a5efdfc44731204bbc080fb29ca474f67"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 
22:41:10.573440 5008 generic.go:334] "Generic (PLEG): container finished" podID="e92e5871-b651-4219-ab37-6c973ea7fc92" containerID="a33cf36664ddfc0d750dd377c6a7555462a6f475e5d8e4fe45bee4fd209033d0" exitCode=0 Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.573564 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" event={"ID":"e92e5871-b651-4219-ab37-6c973ea7fc92","Type":"ContainerDied","Data":"a33cf36664ddfc0d750dd377c6a7555462a6f475e5d8e4fe45bee4fd209033d0"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.579036 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-j5n5x" event={"ID":"d57b4c5c-1f9c-4317-8912-dfc20269e1af","Type":"ContainerStarted","Data":"a2ec6b8076b2f07fc12c2991870a23a0ccece1221c94edf0f5429dcdb2aa995f"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.580017 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-j5n5x" event={"ID":"d57b4c5c-1f9c-4317-8912-dfc20269e1af","Type":"ContainerStarted","Data":"ad0bf6653eae0331462a0a53daf063d953ef58fe2929d204685803470d7e7053"} Nov 26 22:41:10 crc kubenswrapper[5008]: W1126 22:41:10.583706 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86121519_d315_4eb4_9cd0_ab26bcd23b42.slice/crio-65d29807ce2ace02c57d2105e42cf3f18bc151ff5e7215118ffbc38cb5615538 WatchSource:0}: Error finding container 65d29807ce2ace02c57d2105e42cf3f18bc151ff5e7215118ffbc38cb5615538: Status 404 returned error can't find the container with id 65d29807ce2ace02c57d2105e42cf3f18bc151ff5e7215118ffbc38cb5615538 Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.591268 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bv6bq"] Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.592411 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p5xvf" event={"ID":"743cf3d7-7296-4b87-94e3-67fc66dacca7","Type":"ContainerStarted","Data":"537251dabc7a71c7b17804c1a6fdc6b6577da5d2bda726d33bfdaa7e899b79b1"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.592459 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p5xvf" event={"ID":"743cf3d7-7296-4b87-94e3-67fc66dacca7","Type":"ContainerStarted","Data":"cf945af06639b36f539678a2301f0178a21d179ea6c96aeb0ae0b3c37b7b0adf"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.600276 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pwq7l" event={"ID":"520338ff-81f2-44c4-9c0f-0d73bd42984c","Type":"ContainerStarted","Data":"9621f4f9cdcd96ac4a76ac5420b0ccf2e8394f286a376d8ca5a7326ff3485cbc"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.600311 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pwq7l" event={"ID":"520338ff-81f2-44c4-9c0f-0d73bd42984c","Type":"ContainerStarted","Data":"c295a7e8b5fba0a559f14074174c3d69c92f7cc388c7df746ab63e0bcd74b31c"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.603527 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-m4q9d" event={"ID":"041c5cf5-31c5-436f-b5c2-847ee3f3708c","Type":"ContainerStarted","Data":"bce79cada48432b54beb3c6ad966ea463fde19a95e00ea71966bb2dda605c20b"} Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.628215 5008 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6x274"] Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.671531 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:10 crc kubenswrapper[5008]: E1126 22:41:10.673550 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:11.173536687 +0000 UTC m=+146.586230689 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.696292 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l"] Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.702095 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fkn42"] Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.713863 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-s2w2s"] Nov 26 22:41:10 crc kubenswrapper[5008]: 
I1126 22:41:10.733807 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7hhl"] Nov 26 22:41:10 crc kubenswrapper[5008]: W1126 22:41:10.738579 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod967a2b95_097f_4abc_b5eb_e6642b7f2e97.slice/crio-87ba1d72fccec5eaa04a5b3064dda00d2030a69ca931979356b0ecf27e580919 WatchSource:0}: Error finding container 87ba1d72fccec5eaa04a5b3064dda00d2030a69ca931979356b0ecf27e580919: Status 404 returned error can't find the container with id 87ba1d72fccec5eaa04a5b3064dda00d2030a69ca931979356b0ecf27e580919 Nov 26 22:41:10 crc kubenswrapper[5008]: W1126 22:41:10.749342 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68c3e5aa_40f1_4600_951b_1c2460ed8466.slice/crio-43d53c02bc2334edf2e52a1ad10d9091612617526cdbcb77705d49361b03c978 WatchSource:0}: Error finding container 43d53c02bc2334edf2e52a1ad10d9091612617526cdbcb77705d49361b03c978: Status 404 returned error can't find the container with id 43d53c02bc2334edf2e52a1ad10d9091612617526cdbcb77705d49361b03c978 Nov 26 22:41:10 crc kubenswrapper[5008]: W1126 22:41:10.753612 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75f43961_be76_4bce_9d0f_9841bdd21c06.slice/crio-5cda43b8151a25c61991e92a1d224c6195a9a91926765634a5a203b3c1f476f4 WatchSource:0}: Error finding container 5cda43b8151a25c61991e92a1d224c6195a9a91926765634a5a203b3c1f476f4: Status 404 returned error can't find the container with id 5cda43b8151a25c61991e92a1d224c6195a9a91926765634a5a203b3c1f476f4 Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.766855 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" 
Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.772581 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:10 crc kubenswrapper[5008]: E1126 22:41:10.772923 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:11.272900149 +0000 UTC m=+146.685594151 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:10 crc kubenswrapper[5008]: E1126 22:41:10.878279 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:11.378263249 +0000 UTC m=+146.790957261 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.879598 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.981079 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:10 crc kubenswrapper[5008]: E1126 22:41:10.981859 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:11.481819248 +0000 UTC m=+146.894513250 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:10 crc kubenswrapper[5008]: I1126 22:41:10.988616 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-t29mr" podStartSLOduration=123.988592456 podStartE2EDuration="2m3.988592456s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:10.984025279 +0000 UTC m=+146.396719281" watchObservedRunningTime="2025-11-26 22:41:10.988592456 +0000 UTC m=+146.401286468" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.067937 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" podStartSLOduration=124.067915789 podStartE2EDuration="2m4.067915789s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:11.027863652 +0000 UTC m=+146.440557784" watchObservedRunningTime="2025-11-26 22:41:11.067915789 +0000 UTC m=+146.480609791" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.068825 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pwq7l" podStartSLOduration=124.068794214 podStartE2EDuration="2m4.068794214s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:11.064292378 +0000 UTC m=+146.476986380" watchObservedRunningTime="2025-11-26 22:41:11.068794214 +0000 UTC m=+146.481488216" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.083036 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:11 crc kubenswrapper[5008]: E1126 22:41:11.083374 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:11.583357951 +0000 UTC m=+146.996052013 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.114445 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" podStartSLOduration=124.114428847 podStartE2EDuration="2m4.114428847s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:11.113376388 +0000 UTC m=+146.526070390" watchObservedRunningTime="2025-11-26 22:41:11.114428847 +0000 UTC m=+146.527122849" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.145214 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72g4c" podStartSLOduration=124.145197425 podStartE2EDuration="2m4.145197425s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:11.143764665 +0000 UTC m=+146.556458667" watchObservedRunningTime="2025-11-26 22:41:11.145197425 +0000 UTC m=+146.557891427" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.184615 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:11 crc kubenswrapper[5008]: E1126 22:41:11.184985 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:11.684954104 +0000 UTC m=+147.097648096 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.225588 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rz4wr" podStartSLOduration=124.225570228 podStartE2EDuration="2m4.225570228s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:11.182162996 +0000 UTC m=+146.594856988" watchObservedRunningTime="2025-11-26 22:41:11.225570228 +0000 UTC m=+146.638264230" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.264183 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-j5n5x" podStartSLOduration=124.264165574 podStartE2EDuration="2m4.264165574s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-26 22:41:11.223719676 +0000 UTC m=+146.636413698" watchObservedRunningTime="2025-11-26 22:41:11.264165574 +0000 UTC m=+146.676859576" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.286720 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:11 crc kubenswrapper[5008]: E1126 22:41:11.287233 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:11.787219538 +0000 UTC m=+147.199913540 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.319352 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-drtxl" podStartSLOduration=124.319336333 podStartE2EDuration="2m4.319336333s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:11.26722755 +0000 UTC m=+146.679921542" watchObservedRunningTime="2025-11-26 22:41:11.319336333 +0000 UTC 
m=+146.732030335" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.387958 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:11 crc kubenswrapper[5008]: E1126 22:41:11.388102 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:11.888080132 +0000 UTC m=+147.300774134 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.388343 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:11 crc kubenswrapper[5008]: E1126 22:41:11.388628 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 22:41:11.888619946 +0000 UTC m=+147.301313948 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.489991 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:11 crc kubenswrapper[5008]: E1126 22:41:11.490173 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:11.990148968 +0000 UTC m=+147.402842970 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.491070 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:11 crc kubenswrapper[5008]: E1126 22:41:11.491431 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:11.991415574 +0000 UTC m=+147.404109646 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.592451 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:11 crc kubenswrapper[5008]: E1126 22:41:11.593213 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:12.093186513 +0000 UTC m=+147.505880515 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.593412 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:11 crc kubenswrapper[5008]: E1126 22:41:11.593743 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:12.093730018 +0000 UTC m=+147.506424010 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.630037 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzqh8" event={"ID":"56952069-b3c4-4136-94c0-fe90a0e21fcd","Type":"ContainerStarted","Data":"98378de028cd32f2a54e203f6e377dee318ae5c805619f83a84052c0e997b548"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.630080 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzqh8" event={"ID":"56952069-b3c4-4136-94c0-fe90a0e21fcd","Type":"ContainerStarted","Data":"52ebe8133396ef301975f01bff9213d33e9c615ba30d9868bf1a75ef3093db30"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.646420 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzqh8" podStartSLOduration=124.646401877 podStartE2EDuration="2m4.646401877s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:11.646164611 +0000 UTC m=+147.058858613" watchObservedRunningTime="2025-11-26 22:41:11.646401877 +0000 UTC m=+147.059095879" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.652389 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" 
event={"ID":"17969c13-4b9e-4a72-b81f-7160db060271","Type":"ContainerStarted","Data":"28f492fa000c42473c8a7dfa67ae0273e42e699daaa4209f2e5569fcde9a4109"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.666509 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-dzzbw" podStartSLOduration=124.666488218 podStartE2EDuration="2m4.666488218s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:11.666038925 +0000 UTC m=+147.078732947" watchObservedRunningTime="2025-11-26 22:41:11.666488218 +0000 UTC m=+147.079182230" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.681203 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vcl6w" event={"ID":"f9ac7305-8c6d-4fb3-900f-7c64bf930d38","Type":"ContainerStarted","Data":"cde4b3d98307da72b789c2de07903b8ba4b5040c63d2f860ec22221f05f13361"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.689169 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj" event={"ID":"c50d9878-2526-49de-8461-8b077b1c688e","Type":"ContainerStarted","Data":"527e45c1a306992e4a4410a6e8050eacde0024907ba27ca5826e15bdb4bc2335"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.689676 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.694024 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.695803 5008 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-l7qvj container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.695861 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj" podUID="c50d9878-2526-49de-8461-8b077b1c688e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Nov 26 22:41:11 crc kubenswrapper[5008]: E1126 22:41:11.695957 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:12.195910279 +0000 UTC m=+147.608604281 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.701200 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vcl6w" podStartSLOduration=124.701178486 podStartE2EDuration="2m4.701178486s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:11.695383704 +0000 UTC m=+147.108077706" watchObservedRunningTime="2025-11-26 22:41:11.701178486 +0000 UTC m=+147.113872488" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.714200 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-m4q9d" event={"ID":"041c5cf5-31c5-436f-b5c2-847ee3f3708c","Type":"ContainerStarted","Data":"06487ead25ef2679aaf60646ecd81cc356ddf725f57afb4a56e02e38f1105ebf"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.716252 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj" podStartSLOduration=124.716233565 podStartE2EDuration="2m4.716233565s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:11.714788856 +0000 UTC m=+147.127482878" watchObservedRunningTime="2025-11-26 22:41:11.716233565 +0000 UTC m=+147.128927567" Nov 26 22:41:11 crc 
kubenswrapper[5008]: I1126 22:41:11.716837 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fkn42" event={"ID":"3bb7aec8-881b-46d2-a40d-0758d1141f55","Type":"ContainerStarted","Data":"77f49f2361c2d7afdb17a4374a09eef6e96efce042a0d0b1a451438d35439059"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.719599 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p5xvf" event={"ID":"743cf3d7-7296-4b87-94e3-67fc66dacca7","Type":"ContainerStarted","Data":"6c31ba1c7a2d3f3538351c94b9c79762362c822e2627f8efc38f321ed69de3df"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.725155 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s2w2s" event={"ID":"68c3e5aa-40f1-4600-951b-1c2460ed8466","Type":"ContainerStarted","Data":"43d53c02bc2334edf2e52a1ad10d9091612617526cdbcb77705d49361b03c978"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.727366 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r" event={"ID":"dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59","Type":"ContainerStarted","Data":"f643258cf7b1bd7af746e2286fa52da6c5fa1a3bc93cffb1d11d78ed3f69e2a2"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.727404 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r" event={"ID":"dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59","Type":"ContainerStarted","Data":"4dd95a0706cee66d6fd7e7e4fd70779779421445c1dccae2a817019077b155d5"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.728123 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.730012 5008 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65x7g" event={"ID":"1c61407a-51aa-45ed-b80c-0376f72f01a8","Type":"ContainerStarted","Data":"561523b288dbc6948bee7075847468ccd206cfbb899a2e74b7aed43dba8b0923"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.737626 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bv6bq" event={"ID":"218622cd-c88c-44d8-b1f9-5090c8957212","Type":"ContainerStarted","Data":"23d18faa9118647b8d19c70f847cd3ef7becfbabdeb9dbd52af7786d64a21eb6"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.737801 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bv6bq" event={"ID":"218622cd-c88c-44d8-b1f9-5090c8957212","Type":"ContainerStarted","Data":"4d96f47cd92f5e875bc26d61b47fc6b29b6a890f79d185989736b140a5680428"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.755681 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p5xvf" podStartSLOduration=124.755656875 podStartE2EDuration="2m4.755656875s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:11.74611659 +0000 UTC m=+147.158810612" watchObservedRunningTime="2025-11-26 22:41:11.755656875 +0000 UTC m=+147.168350877" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.758161 5008 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dcs8r container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" 
start-of-body= Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.758200 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r" podUID="dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.761048 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nzkzs" event={"ID":"52ab88e6-8cc7-4093-a550-570320f5e62b","Type":"ContainerStarted","Data":"5860c5656ffae8fab0bd504ff769a16acef308237bba56bb77ae13488f5be140"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.771399 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" event={"ID":"97073e75-5bda-4dc4-8d80-bf408068aaef","Type":"ContainerStarted","Data":"c8df45a267dc1bcbf7bf592bb2f4358771c356d549006d2d30985fa9e3171238"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.771445 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" event={"ID":"97073e75-5bda-4dc4-8d80-bf408068aaef","Type":"ContainerStarted","Data":"0aab8801d460828a58c4946b04438e5184f808f3343d13d742b94331354c5084"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.772329 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.777648 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-w58c8" event={"ID":"89f9d155-58da-4221-ac74-ef11aa42fc6f","Type":"ContainerStarted","Data":"8eea6b119ca6130b3d9852968637adbab4cb5d75b255dfe49a01b4b1949e3faa"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.777690 5008 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-w58c8" event={"ID":"89f9d155-58da-4221-ac74-ef11aa42fc6f","Type":"ContainerStarted","Data":"b147781b1f39180bf425429e984155f6af5fd33f522498ef9eb6bce481a8e8ee"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.781184 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" event={"ID":"1cf78e11-8113-4dd4-9b76-182452887bf3","Type":"ContainerStarted","Data":"3cdd410a22c15727acd0d55677a49415abc4b83968d2b0c70973b21d2cd9c267"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.781696 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.788702 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r" podStartSLOduration=124.788681057 podStartE2EDuration="2m4.788681057s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:11.786939538 +0000 UTC m=+147.199633540" watchObservedRunningTime="2025-11-26 22:41:11.788681057 +0000 UTC m=+147.201375069" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.791660 5008 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rwq8l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.791708 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" podUID="97073e75-5bda-4dc4-8d80-bf408068aaef" containerName="marketplace-operator" 
probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.792204 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6x274" event={"ID":"967a2b95-097f-4abc-b5eb-e6642b7f2e97","Type":"ContainerStarted","Data":"87ba1d72fccec5eaa04a5b3064dda00d2030a69ca931979356b0ecf27e580919"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.794192 5008 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-k77kt container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" start-of-body= Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.794246 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" podUID="1cf78e11-8113-4dd4-9b76-182452887bf3" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.796365 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.797300 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-47sdz" event={"ID":"5707c8e2-cffd-4f02-ab17-5eba261c57da","Type":"ContainerStarted","Data":"7ec46434d026df39119eace8c4a6d2f5e18fb0bd1aec337b0df2bcfbc3052a67"} 
Nov 26 22:41:11 crc kubenswrapper[5008]: E1126 22:41:11.800325 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:12.300308031 +0000 UTC m=+147.713002073 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.806858 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l" event={"ID":"cfe3f7c6-e363-4dd7-a52a-e5d43134aba8","Type":"ContainerStarted","Data":"414d1990c389482d8a8f6b46abd05d19a44e4f4d4a1748104fa4d12813146ea7"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.808476 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4d4wh" event={"ID":"0337812a-7694-45b8-92de-1740afe87f27","Type":"ContainerStarted","Data":"c299ad62647527e6ebdcbd2d07ab7c27b1e48476289003c9844f7f8fe2634f83"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.809541 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c" event={"ID":"7d7abe47-9752-4760-bbbf-5dca233eeb30","Type":"ContainerStarted","Data":"99da0460cd8908144aae88e475dfd5d794703f83b922e2f244875367a1838167"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.809564 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c" event={"ID":"7d7abe47-9752-4760-bbbf-5dca233eeb30","Type":"ContainerStarted","Data":"052fbe7dfbbbe2febae9d679173df607cfa5f104dadfc700892bafc9b2f4f015"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.811170 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k" event={"ID":"4848d003-6c90-4cc0-ae0e-1563303c80db","Type":"ContainerStarted","Data":"fec69f5de637fd5acb6eb850e7e4ef08b4c7eaf63e209582427daf66824d072d"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.811228 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k" event={"ID":"4848d003-6c90-4cc0-ae0e-1563303c80db","Type":"ContainerStarted","Data":"49bf6fede2bfe335f37731498b61f7be3306652a64da72572e523a7adb32acf1"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.812445 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.813438 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65x7g" podStartSLOduration=124.813425988 podStartE2EDuration="2m4.813425988s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:11.812135151 +0000 UTC m=+147.224829153" watchObservedRunningTime="2025-11-26 22:41:11.813425988 +0000 UTC m=+147.226120000" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.814144 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-65m9h" 
event={"ID":"3e07175e-42fa-4065-adb1-3469e75ea4d8","Type":"ContainerStarted","Data":"d5550aaf4743456fcb1eae8d3ccbac64332d2d9ba5864095afc03684407f4f63"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.815708 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qdkcv" event={"ID":"4f56002e-5cd7-4eb0-9228-88e40e9a9942","Type":"ContainerStarted","Data":"b2038478aa088fd85a3650c69b176e3d6ee3efdf7f455793b4f3710469996896"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.816386 5008 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-k2f7k container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.816404 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-qdkcv" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.816420 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k" podUID="4848d003-6c90-4cc0-ae0e-1563303c80db" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.821914 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-47bkz" event={"ID":"4dc3d8f0-7538-4163-a3f9-b7e7daa055d2","Type":"ContainerStarted","Data":"596c1749d2ae14d672881926dfcc861aa8cf0f6b60ab02a91f3fa6505573fd89"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.824266 5008 patch_prober.go:28] interesting pod/console-operator-58897d9998-qdkcv container/console-operator 
namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.824311 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-qdkcv" podUID="4f56002e-5cd7-4eb0-9228-88e40e9a9942" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.826082 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jc57b" event={"ID":"86121519-d315-4eb4-9cd0-ab26bcd23b42","Type":"ContainerStarted","Data":"6b4a2510c9d25117b14a9fd4cd154c66abec39bd0b9561ab67236ee3ed785929"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.826133 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jc57b" event={"ID":"86121519-d315-4eb4-9cd0-ab26bcd23b42","Type":"ContainerStarted","Data":"65d29807ce2ace02c57d2105e42cf3f18bc151ff5e7215118ffbc38cb5615538"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.830266 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7hhl" event={"ID":"75f43961-be76-4bce-9d0f-9841bdd21c06","Type":"ContainerStarted","Data":"5cda43b8151a25c61991e92a1d224c6195a9a91926765634a5a203b3c1f476f4"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.831301 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" event={"ID":"e547d918-c76b-4a9a-8717-ee227b89818d","Type":"ContainerStarted","Data":"36d47d995b1be305d35bb7976b133151d6ffc992297aeef0bcc6fda118f3ed74"} Nov 26 22:41:11 crc 
kubenswrapper[5008]: I1126 22:41:11.832714 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bv6bq" podStartSLOduration=124.832701405 podStartE2EDuration="2m4.832701405s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:11.832592222 +0000 UTC m=+147.245286224" watchObservedRunningTime="2025-11-26 22:41:11.832701405 +0000 UTC m=+147.245395407" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.843389 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7rb5v" event={"ID":"c02a7188-7359-43e4-aed9-dd67d2b42875","Type":"ContainerStarted","Data":"171b1604380a9709763c2a24d8f06ed99f417e19af3edf62fc7b9c46cf433f98"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.857564 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" podStartSLOduration=124.857540818 podStartE2EDuration="2m4.857540818s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:11.856319034 +0000 UTC m=+147.269013046" watchObservedRunningTime="2025-11-26 22:41:11.857540818 +0000 UTC m=+147.270234820" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.867353 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq" event={"ID":"41d44cf9-3859-402a-9c77-d5842d7a70a3","Type":"ContainerStarted","Data":"e11940794cb0d696f716a6fe2e3050feb2f4908c675e752f84c93fe59fc6a6d9"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.867406 5008 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq" event={"ID":"41d44cf9-3859-402a-9c77-d5842d7a70a3","Type":"ContainerStarted","Data":"7213b4eef1442e70f8d843bac4afd411dc746b051582a446b08baa3a4ebc4eb8"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.879887 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-qhtsg" event={"ID":"f8023be5-ae5d-4ca1-8541-6ec8d838e1df","Type":"ContainerStarted","Data":"707b4bf99a741a87118077a33e40df98fe2604d584e8143d49bc6d1d0ff78ab6"} Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.881710 5008 patch_prober.go:28] interesting pod/downloads-7954f5f757-t29mr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.881761 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t29mr" podUID="6f72cb1c-2994-4ed4-8333-b7498c1615bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.882488 5008 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xsdh6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.882532 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" podUID="607a3fbf-3586-4c55-894a-f9107fc5679d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: 
connection refused" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.898039 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:11 crc kubenswrapper[5008]: E1126 22:41:11.898253 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:12.398225843 +0000 UTC m=+147.810919895 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.898925 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:11 crc kubenswrapper[5008]: E1126 22:41:11.900853 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 22:41:12.400839856 +0000 UTC m=+147.813533938 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.903823 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k" podStartSLOduration=124.903810378 podStartE2EDuration="2m4.903810378s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:11.90310991 +0000 UTC m=+147.315803932" watchObservedRunningTime="2025-11-26 22:41:11.903810378 +0000 UTC m=+147.316504380" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.904562 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-qdkcv" podStartSLOduration=124.90455858 podStartE2EDuration="2m4.90455858s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:11.881536647 +0000 UTC m=+147.294230659" watchObservedRunningTime="2025-11-26 22:41:11.90455858 +0000 UTC m=+147.317252582" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.928312 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" podStartSLOduration=124.928294832 
podStartE2EDuration="2m4.928294832s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:11.926911723 +0000 UTC m=+147.339605725" watchObservedRunningTime="2025-11-26 22:41:11.928294832 +0000 UTC m=+147.340988834" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.952312 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-w58c8" podStartSLOduration=5.952288691 podStartE2EDuration="5.952288691s" podCreationTimestamp="2025-11-26 22:41:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:11.949309298 +0000 UTC m=+147.362003300" watchObservedRunningTime="2025-11-26 22:41:11.952288691 +0000 UTC m=+147.364982693" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.978118 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-47bkz" podStartSLOduration=124.978099971 podStartE2EDuration="2m4.978099971s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:11.975771017 +0000 UTC m=+147.388465019" watchObservedRunningTime="2025-11-26 22:41:11.978099971 +0000 UTC m=+147.390793973" Nov 26 22:41:11 crc kubenswrapper[5008]: I1126 22:41:11.995086 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-47sdz" podStartSLOduration=5.995065975 podStartE2EDuration="5.995065975s" podCreationTimestamp="2025-11-26 22:41:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 
22:41:11.992317998 +0000 UTC m=+147.405012000" watchObservedRunningTime="2025-11-26 22:41:11.995065975 +0000 UTC m=+147.407759977" Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.002499 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:12 crc kubenswrapper[5008]: E1126 22:41:12.004169 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:12.504144308 +0000 UTC m=+147.916838310 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.032127 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-65m9h" podStartSLOduration=125.032099998 podStartE2EDuration="2m5.032099998s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:12.025239136 +0000 UTC m=+147.437933158" watchObservedRunningTime="2025-11-26 22:41:12.032099998 +0000 UTC m=+147.444794000" Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 
22:41:12.059513 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq" podStartSLOduration=125.059493992 podStartE2EDuration="2m5.059493992s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:12.052903238 +0000 UTC m=+147.465597240" watchObservedRunningTime="2025-11-26 22:41:12.059493992 +0000 UTC m=+147.472187994" Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.106941 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:12 crc kubenswrapper[5008]: E1126 22:41:12.107298 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:12.607287216 +0000 UTC m=+148.019981218 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.114006 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-7rb5v" podStartSLOduration=125.113944031 podStartE2EDuration="2m5.113944031s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:12.111621596 +0000 UTC m=+147.524315608" watchObservedRunningTime="2025-11-26 22:41:12.113944031 +0000 UTC m=+147.526638033" Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.114660 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-qhtsg" podStartSLOduration=125.114653361 podStartE2EDuration="2m5.114653361s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:12.085825196 +0000 UTC m=+147.498519208" watchObservedRunningTime="2025-11-26 22:41:12.114653361 +0000 UTC m=+147.527347363" Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.208732 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:12 crc kubenswrapper[5008]: E1126 22:41:12.209053 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:12.709035254 +0000 UTC m=+148.121729246 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.209299 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:12 crc kubenswrapper[5008]: E1126 22:41:12.209723 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:12.709682832 +0000 UTC m=+148.122376834 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.310669 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:12 crc kubenswrapper[5008]: E1126 22:41:12.310890 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:12.810860104 +0000 UTC m=+148.223554106 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.310952 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:12 crc kubenswrapper[5008]: E1126 22:41:12.311271 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:12.811258256 +0000 UTC m=+148.223952258 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.412166 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:12 crc kubenswrapper[5008]: E1126 22:41:12.412607 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:12.912592353 +0000 UTC m=+148.325286355 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.514096 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:12 crc kubenswrapper[5008]: E1126 22:41:12.514522 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:13.014505595 +0000 UTC m=+148.427199597 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.579032 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.581115 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.581176 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.615057 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:12 crc kubenswrapper[5008]: E1126 22:41:12.615183 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:13.115160143 +0000 UTC m=+148.527854165 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.615396 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:12 crc kubenswrapper[5008]: E1126 22:41:12.615637 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:13.115629407 +0000 UTC m=+148.528323409 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.716309 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:12 crc kubenswrapper[5008]: E1126 22:41:12.716570 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:13.216539232 +0000 UTC m=+148.629233254 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.716668 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:12 crc kubenswrapper[5008]: E1126 22:41:12.717069 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:13.217053356 +0000 UTC m=+148.629747358 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.828513 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:12 crc kubenswrapper[5008]: E1126 22:41:12.828696 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:13.32866957 +0000 UTC m=+148.741363572 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.829083 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:12 crc kubenswrapper[5008]: E1126 22:41:12.829427 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:13.329415391 +0000 UTC m=+148.742109653 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.885954 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fkn42" event={"ID":"3bb7aec8-881b-46d2-a40d-0758d1141f55","Type":"ContainerStarted","Data":"3060269cd438b403486cfc6fe01b9a4e5116de4cc99dbcc98ff50a0ef62fb728"} Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.887068 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6x274" event={"ID":"967a2b95-097f-4abc-b5eb-e6642b7f2e97","Type":"ContainerStarted","Data":"ffa19044d88785e40bf96c10631f32eac19248ad759672d9df89a3e433956a92"} Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.889017 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jc57b" event={"ID":"86121519-d315-4eb4-9cd0-ab26bcd23b42","Type":"ContainerStarted","Data":"8c6c448dbbdf5a03828a09a6901008705711e220951e0cf4f887aa2d17154e6c"} Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.890366 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9qsl7" event={"ID":"c97b71a3-678a-485a-a890-508df1e45bdf","Type":"ContainerStarted","Data":"90512a2583bf3cfc1c9fcaf7e1d343fdb97960ceefa47e6ba1a2810081181ef8"} Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.892782 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l" event={"ID":"cfe3f7c6-e363-4dd7-a52a-e5d43134aba8","Type":"ContainerStarted","Data":"e1bbd4f70ad84a618a02816d78ca0ae52b8d9461bcc5e94344444819b6179f87"} Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.892833 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l" event={"ID":"cfe3f7c6-e363-4dd7-a52a-e5d43134aba8","Type":"ContainerStarted","Data":"f2aabeb9b69233b3df4e0706fa519b54fa5566960652b93a6b9b8d5d07491f71"} Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.895313 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c" event={"ID":"7d7abe47-9752-4760-bbbf-5dca233eeb30","Type":"ContainerStarted","Data":"334775f8bda2d00bea7bf7420c9c43f3902f4bf3c82750c95da31da5573ac28e"} Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.901018 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4d4wh" event={"ID":"0337812a-7694-45b8-92de-1740afe87f27","Type":"ContainerStarted","Data":"4287041231b737b10433c6a19c2b97bd3598f380261f232b2ba6a54197ed51b9"} Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.910997 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fkn42" podStartSLOduration=125.910984727 podStartE2EDuration="2m5.910984727s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:12.908455096 +0000 UTC m=+148.321149098" watchObservedRunningTime="2025-11-26 22:41:12.910984727 +0000 UTC m=+148.323678729" Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.917445 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s2w2s" event={"ID":"68c3e5aa-40f1-4600-951b-1c2460ed8466","Type":"ContainerStarted","Data":"7fe0a3dffa7b6cce9bacc863e8734eae019840e1f4b18fe17d6f9392d2c5b6fb"} Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.917501 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s2w2s" event={"ID":"68c3e5aa-40f1-4600-951b-1c2460ed8466","Type":"ContainerStarted","Data":"9e3fc30420a7a7550e7ff4a153fd88d16ed7420825f9e67374044fc3b83c7267"} Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.921290 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" event={"ID":"397c9d40-0280-4491-8aa1-2f97f28d0b9e","Type":"ContainerStarted","Data":"966959399b97cfdac763a98d922ee58bef90d82adceecf0ead143e4aafb0a59f"} Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.921350 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" event={"ID":"397c9d40-0280-4491-8aa1-2f97f28d0b9e","Type":"ContainerStarted","Data":"431ab3ca713de40491f69fca7e3f395aff23e1bec0032aeb5410f31e1fa1f9ad"} Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.923682 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nzkzs" event={"ID":"52ab88e6-8cc7-4093-a550-570320f5e62b","Type":"ContainerStarted","Data":"fcb28d7bec5e1400848fb1cedcd7028bf07744cf6fcd7d764a6296293babc7b0"} Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.923731 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-nzkzs" Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.924279 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nzkzs" event={"ID":"52ab88e6-8cc7-4093-a550-570320f5e62b","Type":"ContainerStarted","Data":"885e78cd1a6a094b8d87b25949ef17937e8f5ba7a58c35923a0c846ec8bd3aa5"} 
Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.926093 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7hhl" event={"ID":"75f43961-be76-4bce-9d0f-9841bdd21c06","Type":"ContainerStarted","Data":"1a662ab2d48d6dbcb7b14a7a38320f131d7fd2d0bedd189bd457e9cb28504e3e"} Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.926138 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7hhl" event={"ID":"75f43961-be76-4bce-9d0f-9841bdd21c06","Type":"ContainerStarted","Data":"01d08ae1fdb04773c7a4f27f2d665c12502dc435a87b5123b4cfe68745efaeea"} Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.926636 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7hhl" Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.928502 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" event={"ID":"e92e5871-b651-4219-ab37-6c973ea7fc92","Type":"ContainerStarted","Data":"d498cec9032d5306e2bff499d2b295ef6989b56a87d61ef727e2b1341e64a715"} Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.930913 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:12 crc kubenswrapper[5008]: E1126 22:41:12.931314 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 22:41:13.431296673 +0000 UTC m=+148.843990685 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.933211 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-m4q9d" event={"ID":"041c5cf5-31c5-436f-b5c2-847ee3f3708c","Type":"ContainerStarted","Data":"f70b306d91a332f8b35f07c70a32ec0284f9bcb0a00f2c2a52ec8e0f2d4897f8"} Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.934410 5008 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xsdh6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.934439 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" podUID="607a3fbf-3586-4c55-894a-f9107fc5679d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.936247 5008 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dcs8r container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" 
start-of-body= Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.936269 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r" podUID="dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.936321 5008 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-l7qvj container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.936333 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj" podUID="c50d9878-2526-49de-8461-8b077b1c688e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.936374 5008 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-k2f7k container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.936386 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k" podUID="4848d003-6c90-4cc0-ae0e-1563303c80db" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.936428 
5008 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-k77kt container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" start-of-body= Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.936439 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" podUID="1cf78e11-8113-4dd4-9b76-182452887bf3" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.936480 5008 patch_prober.go:28] interesting pod/console-operator-58897d9998-qdkcv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.936493 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-qdkcv" podUID="4f56002e-5cd7-4eb0-9228-88e40e9a9942" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.936532 5008 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rwq8l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.936545 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" 
podUID="97073e75-5bda-4dc4-8d80-bf408068aaef" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.951450 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-htt4l" podStartSLOduration=125.951431985 podStartE2EDuration="2m5.951431985s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:12.933706411 +0000 UTC m=+148.346400413" watchObservedRunningTime="2025-11-26 22:41:12.951431985 +0000 UTC m=+148.364125987" Nov 26 22:41:12 crc kubenswrapper[5008]: I1126 22:41:12.992201 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9qsl7" podStartSLOduration=125.992182102 podStartE2EDuration="2m5.992182102s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:12.956276111 +0000 UTC m=+148.368970113" watchObservedRunningTime="2025-11-26 22:41:12.992182102 +0000 UTC m=+148.404876104" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.016019 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jc57b" podStartSLOduration=126.015999916 podStartE2EDuration="2m6.015999916s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:12.990810473 +0000 UTC m=+148.403504475" watchObservedRunningTime="2025-11-26 
22:41:13.015999916 +0000 UTC m=+148.428693908" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.034919 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:13 crc kubenswrapper[5008]: E1126 22:41:13.038690 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:13.538672289 +0000 UTC m=+148.951366381 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.070780 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nt29c" podStartSLOduration=126.070424135 podStartE2EDuration="2m6.070424135s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:13.042719262 +0000 UTC m=+148.455413284" watchObservedRunningTime="2025-11-26 22:41:13.070424135 +0000 UTC m=+148.483118187" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.072477 5008 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6x274" podStartSLOduration=126.072470522 podStartE2EDuration="2m6.072470522s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:13.023075164 +0000 UTC m=+148.435769156" watchObservedRunningTime="2025-11-26 22:41:13.072470522 +0000 UTC m=+148.485164524" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.108605 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-4d4wh" podStartSLOduration=126.108588139 podStartE2EDuration="2m6.108588139s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:13.094832385 +0000 UTC m=+148.507526387" watchObservedRunningTime="2025-11-26 22:41:13.108588139 +0000 UTC m=+148.521282141" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.136088 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nzkzs" podStartSLOduration=7.136069626 podStartE2EDuration="7.136069626s" podCreationTimestamp="2025-11-26 22:41:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:13.133465243 +0000 UTC m=+148.546159255" watchObservedRunningTime="2025-11-26 22:41:13.136069626 +0000 UTC m=+148.548763618" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.166662 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" podStartSLOduration=126.166646379 podStartE2EDuration="2m6.166646379s" podCreationTimestamp="2025-11-26 22:39:07 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:13.164684434 +0000 UTC m=+148.577378436" watchObservedRunningTime="2025-11-26 22:41:13.166646379 +0000 UTC m=+148.579340381" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.172542 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:13 crc kubenswrapper[5008]: E1126 22:41:13.173113 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:13.673069458 +0000 UTC m=+149.085763460 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.189435 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" podStartSLOduration=126.189419234 podStartE2EDuration="2m6.189419234s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:13.186075111 +0000 UTC m=+148.598769113" watchObservedRunningTime="2025-11-26 22:41:13.189419234 +0000 UTC m=+148.602113236" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.238038 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-m4q9d" podStartSLOduration=126.2380161 podStartE2EDuration="2m6.2380161s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:13.208562558 +0000 UTC m=+148.621256560" watchObservedRunningTime="2025-11-26 22:41:13.2380161 +0000 UTC m=+148.650710102" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.256467 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s2w2s" podStartSLOduration=126.256432713 podStartE2EDuration="2m6.256432713s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:13.256291619 +0000 UTC m=+148.668985621" watchObservedRunningTime="2025-11-26 22:41:13.256432713 +0000 UTC m=+148.669126715" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.258846 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7hhl" podStartSLOduration=126.25881754 podStartE2EDuration="2m6.25881754s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:13.23872809 +0000 UTC m=+148.651422082" watchObservedRunningTime="2025-11-26 22:41:13.25881754 +0000 UTC m=+148.671511552" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.274628 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:13 crc kubenswrapper[5008]: E1126 22:41:13.275075 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:13.775060634 +0000 UTC m=+149.187754626 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.375697 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:13 crc kubenswrapper[5008]: E1126 22:41:13.375890 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:13.875865516 +0000 UTC m=+149.288559518 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.376101 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:13 crc kubenswrapper[5008]: E1126 22:41:13.376518 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:13.876500713 +0000 UTC m=+149.289194715 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.477539 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:13 crc kubenswrapper[5008]: E1126 22:41:13.477750 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:13.977720877 +0000 UTC m=+149.390414889 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.478137 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:13 crc kubenswrapper[5008]: E1126 22:41:13.478546 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:13.978531 +0000 UTC m=+149.391225002 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.579445 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:13 crc kubenswrapper[5008]: E1126 22:41:13.579839 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:14.079807795 +0000 UTC m=+149.492501797 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.584925 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:13 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:13 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:13 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.585017 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.681528 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:13 crc kubenswrapper[5008]: E1126 22:41:13.682000 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 22:41:14.181979665 +0000 UTC m=+149.594673667 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.782485 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:13 crc kubenswrapper[5008]: E1126 22:41:13.782679 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:14.282648204 +0000 UTC m=+149.695342206 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.783156 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:13 crc kubenswrapper[5008]: E1126 22:41:13.783604 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:14.28359356 +0000 UTC m=+149.696287642 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.822772 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.822998 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.824845 5008 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-9kn85 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.825241 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" podUID="e92e5871-b651-4219-ab37-6c973ea7fc92" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.884917 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:13 crc 
kubenswrapper[5008]: E1126 22:41:13.885097 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:14.385071362 +0000 UTC m=+149.797765364 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.885499 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:13 crc kubenswrapper[5008]: E1126 22:41:13.885794 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:14.385786701 +0000 UTC m=+149.798480703 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.938265 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" event={"ID":"e547d918-c76b-4a9a-8717-ee227b89818d","Type":"ContainerStarted","Data":"fd137482df897bb3222475d83cc635c1934e1e472dbe88cdd62124b0a50054df"} Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.938878 5008 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dcs8r container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.939003 5008 patch_prober.go:28] interesting pod/console-operator-58897d9998-qdkcv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.939005 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r" podUID="dfe0f94c-3d84-40e6-8ef0-8ac4d2d13f59" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.939054 5008 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-qdkcv" podUID="4f56002e-5cd7-4eb0-9228-88e40e9a9942" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.939127 5008 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rwq8l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.939154 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" podUID="97073e75-5bda-4dc4-8d80-bf408068aaef" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.939412 5008 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-k2f7k container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.939442 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k" podUID="4848d003-6c90-4cc0-ae0e-1563303c80db" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.964180 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" 
Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.964280 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.965223 5008 patch_prober.go:28] interesting pod/apiserver-76f77b778f-w5vzs container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.965276 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" podUID="397c9d40-0280-4491-8aa1-2f97f28d0b9e" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.986819 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:13 crc kubenswrapper[5008]: E1126 22:41:13.987014 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:14.486943073 +0000 UTC m=+149.899637075 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:13 crc kubenswrapper[5008]: I1126 22:41:13.987150 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:13 crc kubenswrapper[5008]: E1126 22:41:13.987464 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:14.487452977 +0000 UTC m=+149.900146979 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.088434 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:14 crc kubenswrapper[5008]: E1126 22:41:14.088706 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:14.588656361 +0000 UTC m=+150.001350363 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.090200 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:14 crc kubenswrapper[5008]: E1126 22:41:14.092250 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:14.592222981 +0000 UTC m=+150.004916983 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.191408 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:14 crc kubenswrapper[5008]: E1126 22:41:14.191689 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:14.691657114 +0000 UTC m=+150.104351126 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.191995 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:14 crc kubenswrapper[5008]: E1126 22:41:14.192290 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:14.692278002 +0000 UTC m=+150.104972004 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.293643 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:14 crc kubenswrapper[5008]: E1126 22:41:14.293872 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:14.793839955 +0000 UTC m=+150.206533957 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.294012 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.294164 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:14 crc kubenswrapper[5008]: E1126 22:41:14.294585 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:14.794565105 +0000 UTC m=+150.207259167 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.295655 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.396134 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:14 crc kubenswrapper[5008]: E1126 22:41:14.396323 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:14.896278193 +0000 UTC m=+150.308972195 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.396585 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.396910 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.396995 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:41:14 crc kubenswrapper[5008]: E1126 22:41:14.397236 5008 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:14.897218649 +0000 UTC m=+150.309912651 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.404619 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.409222 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.498027 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 
22:41:14 crc kubenswrapper[5008]: E1126 22:41:14.498342 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:14.998296079 +0000 UTC m=+150.410990081 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.498423 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.502120 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.533306 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.541001 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.571197 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72g4c" Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.591810 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:14 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:14 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:14 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.591873 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.599584 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:14 crc kubenswrapper[5008]: E1126 22:41:14.599892 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:15.099881372 +0000 UTC m=+150.512575374 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.631281 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.704124 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:14 crc kubenswrapper[5008]: E1126 22:41:14.705220 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:15.205195501 +0000 UTC m=+150.617889513 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.710413 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:14 crc kubenswrapper[5008]: E1126 22:41:14.710681 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:15.210669364 +0000 UTC m=+150.623363366 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.811394 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:14 crc kubenswrapper[5008]: E1126 22:41:14.811516 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:15.311498427 +0000 UTC m=+150.724192429 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.811978 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:14 crc kubenswrapper[5008]: E1126 22:41:14.812298 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:15.312289969 +0000 UTC m=+150.724983971 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.916598 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:14 crc kubenswrapper[5008]: E1126 22:41:14.924088 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:15.424060357 +0000 UTC m=+150.836754349 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.939005 5008 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-l7qvj container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.939065 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj" podUID="c50d9878-2526-49de-8461-8b077b1c688e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.963124 5008 generic.go:334] "Generic (PLEG): container finished" podID="41d44cf9-3859-402a-9c77-d5842d7a70a3" containerID="e11940794cb0d696f716a6fe2e3050feb2f4908c675e752f84c93fe59fc6a6d9" exitCode=0 Nov 26 22:41:14 crc kubenswrapper[5008]: I1126 22:41:14.963792 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq" event={"ID":"41d44cf9-3859-402a-9c77-d5842d7a70a3","Type":"ContainerDied","Data":"e11940794cb0d696f716a6fe2e3050feb2f4908c675e752f84c93fe59fc6a6d9"} Nov 26 22:41:15 crc kubenswrapper[5008]: I1126 22:41:15.021189 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:15 crc kubenswrapper[5008]: E1126 22:41:15.022488 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:15.522472582 +0000 UTC m=+150.935166674 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:15 crc kubenswrapper[5008]: I1126 22:41:15.125670 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:15 crc kubenswrapper[5008]: E1126 22:41:15.126165 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:15.626145724 +0000 UTC m=+151.038839726 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:15 crc kubenswrapper[5008]: I1126 22:41:15.231443 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:15 crc kubenswrapper[5008]: E1126 22:41:15.231981 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:15.731953996 +0000 UTC m=+151.144647998 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:15 crc kubenswrapper[5008]: I1126 22:41:15.333399 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:15 crc kubenswrapper[5008]: E1126 22:41:15.333698 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:15.833683274 +0000 UTC m=+151.246377276 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:15 crc kubenswrapper[5008]: I1126 22:41:15.439624 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:15 crc kubenswrapper[5008]: E1126 22:41:15.439899 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:15.939888647 +0000 UTC m=+151.352582649 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:15 crc kubenswrapper[5008]: W1126 22:41:15.489513 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-2af132fd2c7b9cfd126f932c17e1ae1214ef562baa1ee8c8db925ca1b0ce74b6 WatchSource:0}: Error finding container 2af132fd2c7b9cfd126f932c17e1ae1214ef562baa1ee8c8db925ca1b0ce74b6: Status 404 returned error can't find the container with id 2af132fd2c7b9cfd126f932c17e1ae1214ef562baa1ee8c8db925ca1b0ce74b6 Nov 26 22:41:15 crc kubenswrapper[5008]: I1126 22:41:15.540832 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:15 crc kubenswrapper[5008]: E1126 22:41:15.541308 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:16.041287776 +0000 UTC m=+151.453981778 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:15 crc kubenswrapper[5008]: I1126 22:41:15.583165 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:15 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:15 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:15 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:15 crc kubenswrapper[5008]: I1126 22:41:15.583217 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:15 crc kubenswrapper[5008]: I1126 22:41:15.642875 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:15 crc kubenswrapper[5008]: E1126 22:41:15.643507 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 22:41:16.143494748 +0000 UTC m=+151.556188750 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:15 crc kubenswrapper[5008]: I1126 22:41:15.744051 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:15 crc kubenswrapper[5008]: E1126 22:41:15.744437 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:16.244417503 +0000 UTC m=+151.657111505 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:15 crc kubenswrapper[5008]: I1126 22:41:15.846307 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:15 crc kubenswrapper[5008]: E1126 22:41:15.846680 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:16.346665145 +0000 UTC m=+151.759359137 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:15 crc kubenswrapper[5008]: I1126 22:41:15.947750 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:15 crc kubenswrapper[5008]: E1126 22:41:15.947913 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:16.447883259 +0000 UTC m=+151.860577281 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:15 crc kubenswrapper[5008]: I1126 22:41:15.948168 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:15 crc kubenswrapper[5008]: E1126 22:41:15.948463 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:16.448451725 +0000 UTC m=+151.861145727 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:15 crc kubenswrapper[5008]: I1126 22:41:15.967628 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"003dc195c47c8096214c373d14fdfd0f8e0e9375b8f16fa5ed4df0f546423cfc"} Nov 26 22:41:15 crc kubenswrapper[5008]: I1126 22:41:15.967674 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2af132fd2c7b9cfd126f932c17e1ae1214ef562baa1ee8c8db925ca1b0ce74b6"} Nov 26 22:41:15 crc kubenswrapper[5008]: I1126 22:41:15.967856 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:41:15 crc kubenswrapper[5008]: I1126 22:41:15.968985 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d97796683f76e2e0e083ca27bdc059fcb0e37416a22f9c9d6e124d700a8cffcc"} Nov 26 22:41:15 crc kubenswrapper[5008]: I1126 22:41:15.969027 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ed71cae59386706497a3b8cfc3143ed698c7cf358f536f12daa7fb345f684b28"} Nov 26 22:41:15 crc kubenswrapper[5008]: I1126 22:41:15.969928 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"35c81617ba52f6b25ae521f2f858eb8b1c1f67c5e3d3afc6f00d72fa53c85254"} Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.007814 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pbtdl"] Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.008704 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pbtdl" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.010514 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.022379 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pbtdl"] Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.054678 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.059295 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f7c489e-32bf-4761-a529-e8ca560145ad-catalog-content\") pod \"community-operators-pbtdl\" (UID: \"3f7c489e-32bf-4761-a529-e8ca560145ad\") " 
pod="openshift-marketplace/community-operators-pbtdl" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.059434 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f7c489e-32bf-4761-a529-e8ca560145ad-utilities\") pod \"community-operators-pbtdl\" (UID: \"3f7c489e-32bf-4761-a529-e8ca560145ad\") " pod="openshift-marketplace/community-operators-pbtdl" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.059494 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngr7s\" (UniqueName: \"kubernetes.io/projected/3f7c489e-32bf-4761-a529-e8ca560145ad-kube-api-access-ngr7s\") pod \"community-operators-pbtdl\" (UID: \"3f7c489e-32bf-4761-a529-e8ca560145ad\") " pod="openshift-marketplace/community-operators-pbtdl" Nov 26 22:41:16 crc kubenswrapper[5008]: E1126 22:41:16.060325 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:16.560301535 +0000 UTC m=+151.972995537 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.109787 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.110988 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.113356 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.113548 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.128930 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.160712 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f7c489e-32bf-4761-a529-e8ca560145ad-utilities\") pod \"community-operators-pbtdl\" (UID: \"3f7c489e-32bf-4761-a529-e8ca560145ad\") " pod="openshift-marketplace/community-operators-pbtdl" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.161037 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngr7s\" (UniqueName: 
\"kubernetes.io/projected/3f7c489e-32bf-4761-a529-e8ca560145ad-kube-api-access-ngr7s\") pod \"community-operators-pbtdl\" (UID: \"3f7c489e-32bf-4761-a529-e8ca560145ad\") " pod="openshift-marketplace/community-operators-pbtdl" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.161094 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.161117 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f7c489e-32bf-4761-a529-e8ca560145ad-catalog-content\") pod \"community-operators-pbtdl\" (UID: \"3f7c489e-32bf-4761-a529-e8ca560145ad\") " pod="openshift-marketplace/community-operators-pbtdl" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.161158 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.161187 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.161686 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f7c489e-32bf-4761-a529-e8ca560145ad-utilities\") pod \"community-operators-pbtdl\" (UID: \"3f7c489e-32bf-4761-a529-e8ca560145ad\") " pod="openshift-marketplace/community-operators-pbtdl" Nov 26 22:41:16 crc kubenswrapper[5008]: E1126 22:41:16.162301 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:16.66229031 +0000 UTC m=+152.074984312 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.162754 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f7c489e-32bf-4761-a529-e8ca560145ad-catalog-content\") pod \"community-operators-pbtdl\" (UID: \"3f7c489e-32bf-4761-a529-e8ca560145ad\") " pod="openshift-marketplace/community-operators-pbtdl" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.182293 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngr7s\" (UniqueName: \"kubernetes.io/projected/3f7c489e-32bf-4761-a529-e8ca560145ad-kube-api-access-ngr7s\") pod \"community-operators-pbtdl\" (UID: \"3f7c489e-32bf-4761-a529-e8ca560145ad\") " pod="openshift-marketplace/community-operators-pbtdl" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.206607 5008 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xqtk6"] Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.207664 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xqtk6" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.212995 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.230493 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xqtk6"] Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.261670 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.261854 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qmpj\" (UniqueName: \"kubernetes.io/projected/278822ff-d2bf-46ef-af05-b6e32a132844-kube-api-access-6qmpj\") pod \"certified-operators-xqtk6\" (UID: \"278822ff-d2bf-46ef-af05-b6e32a132844\") " pod="openshift-marketplace/certified-operators-xqtk6" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.261917 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.261937 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278822ff-d2bf-46ef-af05-b6e32a132844-catalog-content\") pod \"certified-operators-xqtk6\" (UID: \"278822ff-d2bf-46ef-af05-b6e32a132844\") " pod="openshift-marketplace/certified-operators-xqtk6" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.261955 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278822ff-d2bf-46ef-af05-b6e32a132844-utilities\") pod \"certified-operators-xqtk6\" (UID: \"278822ff-d2bf-46ef-af05-b6e32a132844\") " pod="openshift-marketplace/certified-operators-xqtk6" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.261988 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.262047 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 22:41:16 crc kubenswrapper[5008]: E1126 22:41:16.262106 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:16.762093715 +0000 UTC m=+152.174787707 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.263373 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.283414 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.323187 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pbtdl" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.362253 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41d44cf9-3859-402a-9c77-d5842d7a70a3-secret-volume\") pod \"41d44cf9-3859-402a-9c77-d5842d7a70a3\" (UID: \"41d44cf9-3859-402a-9c77-d5842d7a70a3\") " Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.362304 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41d44cf9-3859-402a-9c77-d5842d7a70a3-config-volume\") pod \"41d44cf9-3859-402a-9c77-d5842d7a70a3\" (UID: \"41d44cf9-3859-402a-9c77-d5842d7a70a3\") " Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.362388 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsds5\" (UniqueName: \"kubernetes.io/projected/41d44cf9-3859-402a-9c77-d5842d7a70a3-kube-api-access-zsds5\") pod \"41d44cf9-3859-402a-9c77-d5842d7a70a3\" (UID: \"41d44cf9-3859-402a-9c77-d5842d7a70a3\") " Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.362516 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278822ff-d2bf-46ef-af05-b6e32a132844-catalog-content\") pod \"certified-operators-xqtk6\" (UID: \"278822ff-d2bf-46ef-af05-b6e32a132844\") " pod="openshift-marketplace/certified-operators-xqtk6" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.362537 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278822ff-d2bf-46ef-af05-b6e32a132844-utilities\") pod \"certified-operators-xqtk6\" (UID: \"278822ff-d2bf-46ef-af05-b6e32a132844\") " pod="openshift-marketplace/certified-operators-xqtk6" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.362583 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qmpj\" (UniqueName: \"kubernetes.io/projected/278822ff-d2bf-46ef-af05-b6e32a132844-kube-api-access-6qmpj\") pod \"certified-operators-xqtk6\" (UID: \"278822ff-d2bf-46ef-af05-b6e32a132844\") " pod="openshift-marketplace/certified-operators-xqtk6" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.362620 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:16 crc kubenswrapper[5008]: E1126 22:41:16.362840 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:16.862829465 +0000 UTC m=+152.275523467 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.365453 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278822ff-d2bf-46ef-af05-b6e32a132844-catalog-content\") pod \"certified-operators-xqtk6\" (UID: \"278822ff-d2bf-46ef-af05-b6e32a132844\") " pod="openshift-marketplace/certified-operators-xqtk6" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.366003 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41d44cf9-3859-402a-9c77-d5842d7a70a3-config-volume" (OuterVolumeSpecName: "config-volume") pod "41d44cf9-3859-402a-9c77-d5842d7a70a3" (UID: "41d44cf9-3859-402a-9c77-d5842d7a70a3"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.369297 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278822ff-d2bf-46ef-af05-b6e32a132844-utilities\") pod \"certified-operators-xqtk6\" (UID: \"278822ff-d2bf-46ef-af05-b6e32a132844\") " pod="openshift-marketplace/certified-operators-xqtk6" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.375489 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41d44cf9-3859-402a-9c77-d5842d7a70a3-kube-api-access-zsds5" (OuterVolumeSpecName: "kube-api-access-zsds5") pod "41d44cf9-3859-402a-9c77-d5842d7a70a3" (UID: "41d44cf9-3859-402a-9c77-d5842d7a70a3"). InnerVolumeSpecName "kube-api-access-zsds5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.377722 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41d44cf9-3859-402a-9c77-d5842d7a70a3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "41d44cf9-3859-402a-9c77-d5842d7a70a3" (UID: "41d44cf9-3859-402a-9c77-d5842d7a70a3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.402694 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qmpj\" (UniqueName: \"kubernetes.io/projected/278822ff-d2bf-46ef-af05-b6e32a132844-kube-api-access-6qmpj\") pod \"certified-operators-xqtk6\" (UID: \"278822ff-d2bf-46ef-af05-b6e32a132844\") " pod="openshift-marketplace/certified-operators-xqtk6" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.413710 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9s8kp"] Nov 26 22:41:16 crc kubenswrapper[5008]: E1126 22:41:16.413893 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d44cf9-3859-402a-9c77-d5842d7a70a3" containerName="collect-profiles" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.413904 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d44cf9-3859-402a-9c77-d5842d7a70a3" containerName="collect-profiles" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.413990 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="41d44cf9-3859-402a-9c77-d5842d7a70a3" containerName="collect-profiles" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.415283 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9s8kp" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.430492 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9s8kp"] Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.436140 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.463172 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.463419 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77584e4e-5147-421a-b7c0-24c7403c003a-utilities\") pod \"community-operators-9s8kp\" (UID: \"77584e4e-5147-421a-b7c0-24c7403c003a\") " pod="openshift-marketplace/community-operators-9s8kp" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.463459 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm4w6\" (UniqueName: \"kubernetes.io/projected/77584e4e-5147-421a-b7c0-24c7403c003a-kube-api-access-mm4w6\") pod \"community-operators-9s8kp\" (UID: \"77584e4e-5147-421a-b7c0-24c7403c003a\") " pod="openshift-marketplace/community-operators-9s8kp" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.463496 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77584e4e-5147-421a-b7c0-24c7403c003a-catalog-content\") pod \"community-operators-9s8kp\" (UID: \"77584e4e-5147-421a-b7c0-24c7403c003a\") " pod="openshift-marketplace/community-operators-9s8kp" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.463529 5008 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41d44cf9-3859-402a-9c77-d5842d7a70a3-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 
26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.463538 5008 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41d44cf9-3859-402a-9c77-d5842d7a70a3-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.463548 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsds5\" (UniqueName: \"kubernetes.io/projected/41d44cf9-3859-402a-9c77-d5842d7a70a3-kube-api-access-zsds5\") on node \"crc\" DevicePath \"\"" Nov 26 22:41:16 crc kubenswrapper[5008]: E1126 22:41:16.463614 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:16.963600477 +0000 UTC m=+152.376294479 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.528268 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xqtk6" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.565237 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77584e4e-5147-421a-b7c0-24c7403c003a-utilities\") pod \"community-operators-9s8kp\" (UID: \"77584e4e-5147-421a-b7c0-24c7403c003a\") " pod="openshift-marketplace/community-operators-9s8kp" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.565292 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm4w6\" (UniqueName: \"kubernetes.io/projected/77584e4e-5147-421a-b7c0-24c7403c003a-kube-api-access-mm4w6\") pod \"community-operators-9s8kp\" (UID: \"77584e4e-5147-421a-b7c0-24c7403c003a\") " pod="openshift-marketplace/community-operators-9s8kp" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.565321 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.565346 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77584e4e-5147-421a-b7c0-24c7403c003a-catalog-content\") pod \"community-operators-9s8kp\" (UID: \"77584e4e-5147-421a-b7c0-24c7403c003a\") " pod="openshift-marketplace/community-operators-9s8kp" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.566002 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77584e4e-5147-421a-b7c0-24c7403c003a-catalog-content\") pod 
\"community-operators-9s8kp\" (UID: \"77584e4e-5147-421a-b7c0-24c7403c003a\") " pod="openshift-marketplace/community-operators-9s8kp" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.566428 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77584e4e-5147-421a-b7c0-24c7403c003a-utilities\") pod \"community-operators-9s8kp\" (UID: \"77584e4e-5147-421a-b7c0-24c7403c003a\") " pod="openshift-marketplace/community-operators-9s8kp" Nov 26 22:41:16 crc kubenswrapper[5008]: E1126 22:41:16.566734 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:17.066720833 +0000 UTC m=+152.479414835 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.588196 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:16 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:16 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:16 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.588262 5008 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.601275 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm4w6\" (UniqueName: \"kubernetes.io/projected/77584e4e-5147-421a-b7c0-24c7403c003a-kube-api-access-mm4w6\") pod \"community-operators-9s8kp\" (UID: \"77584e4e-5147-421a-b7c0-24c7403c003a\") " pod="openshift-marketplace/community-operators-9s8kp" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.612155 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hqxrq"] Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.613022 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqxrq" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.624791 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hqxrq"] Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.676302 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.676642 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/012ef2d4-9b7c-4838-9b17-345ee0057cd1-utilities\") pod \"certified-operators-hqxrq\" (UID: \"012ef2d4-9b7c-4838-9b17-345ee0057cd1\") " pod="openshift-marketplace/certified-operators-hqxrq" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.676669 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/012ef2d4-9b7c-4838-9b17-345ee0057cd1-catalog-content\") pod \"certified-operators-hqxrq\" (UID: \"012ef2d4-9b7c-4838-9b17-345ee0057cd1\") " pod="openshift-marketplace/certified-operators-hqxrq" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.676728 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5gnd\" (UniqueName: \"kubernetes.io/projected/012ef2d4-9b7c-4838-9b17-345ee0057cd1-kube-api-access-s5gnd\") pod \"certified-operators-hqxrq\" (UID: \"012ef2d4-9b7c-4838-9b17-345ee0057cd1\") " pod="openshift-marketplace/certified-operators-hqxrq" Nov 26 22:41:16 crc kubenswrapper[5008]: E1126 22:41:16.676900 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:17.176883537 +0000 UTC m=+152.589577539 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.733741 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9s8kp" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.782843 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5gnd\" (UniqueName: \"kubernetes.io/projected/012ef2d4-9b7c-4838-9b17-345ee0057cd1-kube-api-access-s5gnd\") pod \"certified-operators-hqxrq\" (UID: \"012ef2d4-9b7c-4838-9b17-345ee0057cd1\") " pod="openshift-marketplace/certified-operators-hqxrq" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.783166 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.783202 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/012ef2d4-9b7c-4838-9b17-345ee0057cd1-utilities\") pod \"certified-operators-hqxrq\" (UID: \"012ef2d4-9b7c-4838-9b17-345ee0057cd1\") " pod="openshift-marketplace/certified-operators-hqxrq" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.783218 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/012ef2d4-9b7c-4838-9b17-345ee0057cd1-catalog-content\") pod \"certified-operators-hqxrq\" (UID: \"012ef2d4-9b7c-4838-9b17-345ee0057cd1\") " pod="openshift-marketplace/certified-operators-hqxrq" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.783609 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/012ef2d4-9b7c-4838-9b17-345ee0057cd1-catalog-content\") pod 
\"certified-operators-hqxrq\" (UID: \"012ef2d4-9b7c-4838-9b17-345ee0057cd1\") " pod="openshift-marketplace/certified-operators-hqxrq" Nov 26 22:41:16 crc kubenswrapper[5008]: E1126 22:41:16.784172 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:17.284161469 +0000 UTC m=+152.696855471 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.784503 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/012ef2d4-9b7c-4838-9b17-345ee0057cd1-utilities\") pod \"certified-operators-hqxrq\" (UID: \"012ef2d4-9b7c-4838-9b17-345ee0057cd1\") " pod="openshift-marketplace/certified-operators-hqxrq" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.812327 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5gnd\" (UniqueName: \"kubernetes.io/projected/012ef2d4-9b7c-4838-9b17-345ee0057cd1-kube-api-access-s5gnd\") pod \"certified-operators-hqxrq\" (UID: \"012ef2d4-9b7c-4838-9b17-345ee0057cd1\") " pod="openshift-marketplace/certified-operators-hqxrq" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.837438 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xqtk6"] Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.850430 5008 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-marketplace/community-operators-pbtdl"] Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.883818 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:16 crc kubenswrapper[5008]: E1126 22:41:16.884075 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:17.384052866 +0000 UTC m=+152.796746868 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.884132 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:16 crc kubenswrapper[5008]: E1126 22:41:16.884457 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-11-26 22:41:17.384446557 +0000 UTC m=+152.797140629 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.887841 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 26 22:41:16 crc kubenswrapper[5008]: W1126 22:41:16.891077 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod278822ff_d2bf_46ef_af05_b6e32a132844.slice/crio-065b83a5611fc9caa8e1c26065c5b17c694188c2a631ad5235b8b55665aff6f7 WatchSource:0}: Error finding container 065b83a5611fc9caa8e1c26065c5b17c694188c2a631ad5235b8b55665aff6f7: Status 404 returned error can't find the container with id 065b83a5611fc9caa8e1c26065c5b17c694188c2a631ad5235b8b55665aff6f7 Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.941302 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hqxrq" Nov 26 22:41:16 crc kubenswrapper[5008]: I1126 22:41:16.988508 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:16 crc kubenswrapper[5008]: E1126 22:41:16.988923 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:17.488899891 +0000 UTC m=+152.901593893 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.018895 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbtdl" event={"ID":"3f7c489e-32bf-4761-a529-e8ca560145ad","Type":"ContainerStarted","Data":"34a3ffa0c91bb11053a3f2a3ae1e91d3c10f467743d66f71b1481a048462d11b"} Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.026361 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1210db0ff4581b21e8f0920c2017fb9010e968f68aab9d0856d919c9a5c3ad47"} Nov 26 
22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.031002 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c","Type":"ContainerStarted","Data":"9a935efed74ee2accfead647c26de3496c5e09a8f9a308193fe02261b983cba9"} Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.033346 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq" event={"ID":"41d44cf9-3859-402a-9c77-d5842d7a70a3","Type":"ContainerDied","Data":"7213b4eef1442e70f8d843bac4afd411dc746b051582a446b08baa3a4ebc4eb8"} Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.033381 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7213b4eef1442e70f8d843bac4afd411dc746b051582a446b08baa3a4ebc4eb8" Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.033423 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29403270-9mdcq" Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.041575 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xqtk6" event={"ID":"278822ff-d2bf-46ef-af05-b6e32a132844","Type":"ContainerStarted","Data":"065b83a5611fc9caa8e1c26065c5b17c694188c2a631ad5235b8b55665aff6f7"} Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.079642 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9s8kp"] Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.091621 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:17 crc kubenswrapper[5008]: E1126 22:41:17.092005 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:17.591987697 +0000 UTC m=+153.004681699 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:17 crc kubenswrapper[5008]: W1126 22:41:17.108761 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77584e4e_5147_421a_b7c0_24c7403c003a.slice/crio-b7904422f275eaa2ccd91db1f37effb2dcf85daaee69df157daca731bb2f3a59 WatchSource:0}: Error finding container b7904422f275eaa2ccd91db1f37effb2dcf85daaee69df157daca731bb2f3a59: Status 404 returned error can't find the container with id b7904422f275eaa2ccd91db1f37effb2dcf85daaee69df157daca731bb2f3a59 Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.192617 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:17 crc kubenswrapper[5008]: E1126 22:41:17.193692 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:17.693666593 +0000 UTC m=+153.106360665 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.296642 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:17 crc kubenswrapper[5008]: E1126 22:41:17.297071 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:17.797038567 +0000 UTC m=+153.209732569 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.397158 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:17 crc kubenswrapper[5008]: E1126 22:41:17.397345 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:17.897320825 +0000 UTC m=+153.310014827 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.397439 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:17 crc kubenswrapper[5008]: E1126 22:41:17.397717 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:17.897709786 +0000 UTC m=+153.310403788 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.466461 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hqxrq"] Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.498235 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:17 crc kubenswrapper[5008]: E1126 22:41:17.498384 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:17.998351193 +0000 UTC m=+153.411045195 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.498528 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:17 crc kubenswrapper[5008]: E1126 22:41:17.498825 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:17.998815266 +0000 UTC m=+153.411509268 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.582282 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:17 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:17 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:17 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.582361 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.600089 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:17 crc kubenswrapper[5008]: E1126 22:41:17.600347 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 22:41:18.100313718 +0000 UTC m=+153.513007720 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.600527 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:17 crc kubenswrapper[5008]: E1126 22:41:17.600913 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:18.100899464 +0000 UTC m=+153.513593466 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.702345 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:17 crc kubenswrapper[5008]: E1126 22:41:17.702493 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:18.202467338 +0000 UTC m=+153.615161350 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.702619 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:17 crc kubenswrapper[5008]: E1126 22:41:17.703856 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:18.203834946 +0000 UTC m=+153.616528958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.803507 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:17 crc kubenswrapper[5008]: E1126 22:41:17.803660 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:18.30363839 +0000 UTC m=+153.716332402 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.803690 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:17 crc kubenswrapper[5008]: E1126 22:41:17.804047 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:18.304035861 +0000 UTC m=+153.716729863 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.904586 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:17 crc kubenswrapper[5008]: E1126 22:41:17.904706 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:18.404683359 +0000 UTC m=+153.817377371 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:17 crc kubenswrapper[5008]: I1126 22:41:17.904728 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:17 crc kubenswrapper[5008]: E1126 22:41:17.905106 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:18.405095321 +0000 UTC m=+153.817789323 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.005348 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:18 crc kubenswrapper[5008]: E1126 22:41:18.005496 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:18.505474201 +0000 UTC m=+153.918168203 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.005893 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:18 crc kubenswrapper[5008]: E1126 22:41:18.006340 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:18.506319785 +0000 UTC m=+153.919013787 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.049054 5008 generic.go:334] "Generic (PLEG): container finished" podID="278822ff-d2bf-46ef-af05-b6e32a132844" containerID="87e2772be03699f8637580705cde1e49c0bf01ce86e59ffac408b49371c2b135" exitCode=0 Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.049126 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xqtk6" event={"ID":"278822ff-d2bf-46ef-af05-b6e32a132844","Type":"ContainerDied","Data":"87e2772be03699f8637580705cde1e49c0bf01ce86e59ffac408b49371c2b135"} Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.050452 5008 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.050777 5008 generic.go:334] "Generic (PLEG): container finished" podID="3f7c489e-32bf-4761-a529-e8ca560145ad" containerID="5867cd062239b44c8db73c756517bff1042095d36f8205a1004312ea33092e95" exitCode=0 Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.050848 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbtdl" event={"ID":"3f7c489e-32bf-4761-a529-e8ca560145ad","Type":"ContainerDied","Data":"5867cd062239b44c8db73c756517bff1042095d36f8205a1004312ea33092e95"} Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.052261 5008 generic.go:334] "Generic (PLEG): container finished" podID="77584e4e-5147-421a-b7c0-24c7403c003a" 
containerID="6fe66479a1ef2566f2b33d069323211455bd174791bc9062e11262873f541897" exitCode=0 Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.052321 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9s8kp" event={"ID":"77584e4e-5147-421a-b7c0-24c7403c003a","Type":"ContainerDied","Data":"6fe66479a1ef2566f2b33d069323211455bd174791bc9062e11262873f541897"} Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.052343 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9s8kp" event={"ID":"77584e4e-5147-421a-b7c0-24c7403c003a","Type":"ContainerStarted","Data":"b7904422f275eaa2ccd91db1f37effb2dcf85daaee69df157daca731bb2f3a59"} Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.055409 5008 generic.go:334] "Generic (PLEG): container finished" podID="a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c" containerID="d82e919544ba72788dbb96c06f2caf4274580e45600fb1fc41526b9303146827" exitCode=0 Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.055468 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c","Type":"ContainerDied","Data":"d82e919544ba72788dbb96c06f2caf4274580e45600fb1fc41526b9303146827"} Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.057126 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" event={"ID":"e547d918-c76b-4a9a-8717-ee227b89818d","Type":"ContainerStarted","Data":"a3ae8e2fafe0628c5d6907a393edf0266accfcb76b862c1eb2defbda3d00d19a"} Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.057153 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" event={"ID":"e547d918-c76b-4a9a-8717-ee227b89818d","Type":"ContainerStarted","Data":"5c8e1698ecf86ee3edbd993e72c4c7cabe536cc7e748df20b4309ff09f6d5de3"} Nov 26 22:41:18 crc 
kubenswrapper[5008]: I1126 22:41:18.058241 5008 generic.go:334] "Generic (PLEG): container finished" podID="012ef2d4-9b7c-4838-9b17-345ee0057cd1" containerID="88989776b6848832c66d04d9f3d482fc34eadc49de88abece29f0a8c467edbd9" exitCode=0 Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.059071 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqxrq" event={"ID":"012ef2d4-9b7c-4838-9b17-345ee0057cd1","Type":"ContainerDied","Data":"88989776b6848832c66d04d9f3d482fc34eadc49de88abece29f0a8c467edbd9"} Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.059144 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqxrq" event={"ID":"012ef2d4-9b7c-4838-9b17-345ee0057cd1","Type":"ContainerStarted","Data":"73ada0635782cac75900fda5c6af80fa481ec8217905aae0d48ebe0184fa3896"} Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.108023 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:18 crc kubenswrapper[5008]: E1126 22:41:18.110159 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:18.610134941 +0000 UTC m=+154.022828943 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.139117 5008 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.211336 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:18 crc kubenswrapper[5008]: E1126 22:41:18.211853 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:18.711827527 +0000 UTC m=+154.124521549 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.215102 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t7zbl"] Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.216616 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t7zbl" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.220403 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.223884 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t7zbl"] Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.312761 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:18 crc kubenswrapper[5008]: E1126 22:41:18.312841 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:18.812825265 +0000 UTC m=+154.225519267 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.313091 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.313132 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c63d5fd4-0d21-44b4-862a-293ceb4321c9-catalog-content\") pod \"redhat-marketplace-t7zbl\" (UID: \"c63d5fd4-0d21-44b4-862a-293ceb4321c9\") " pod="openshift-marketplace/redhat-marketplace-t7zbl" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.313230 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c63d5fd4-0d21-44b4-862a-293ceb4321c9-utilities\") pod \"redhat-marketplace-t7zbl\" (UID: \"c63d5fd4-0d21-44b4-862a-293ceb4321c9\") " pod="openshift-marketplace/redhat-marketplace-t7zbl" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.313299 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sxqc\" (UniqueName: 
\"kubernetes.io/projected/c63d5fd4-0d21-44b4-862a-293ceb4321c9-kube-api-access-9sxqc\") pod \"redhat-marketplace-t7zbl\" (UID: \"c63d5fd4-0d21-44b4-862a-293ceb4321c9\") " pod="openshift-marketplace/redhat-marketplace-t7zbl" Nov 26 22:41:18 crc kubenswrapper[5008]: E1126 22:41:18.313464 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:18.813454343 +0000 UTC m=+154.226148425 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.413798 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:18 crc kubenswrapper[5008]: E1126 22:41:18.414015 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:18.913989348 +0000 UTC m=+154.326683350 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.414092 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c63d5fd4-0d21-44b4-862a-293ceb4321c9-utilities\") pod \"redhat-marketplace-t7zbl\" (UID: \"c63d5fd4-0d21-44b4-862a-293ceb4321c9\") " pod="openshift-marketplace/redhat-marketplace-t7zbl" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.414144 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sxqc\" (UniqueName: \"kubernetes.io/projected/c63d5fd4-0d21-44b4-862a-293ceb4321c9-kube-api-access-9sxqc\") pod \"redhat-marketplace-t7zbl\" (UID: \"c63d5fd4-0d21-44b4-862a-293ceb4321c9\") " pod="openshift-marketplace/redhat-marketplace-t7zbl" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.414191 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.414213 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c63d5fd4-0d21-44b4-862a-293ceb4321c9-catalog-content\") pod \"redhat-marketplace-t7zbl\" (UID: 
\"c63d5fd4-0d21-44b4-862a-293ceb4321c9\") " pod="openshift-marketplace/redhat-marketplace-t7zbl" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.414749 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c63d5fd4-0d21-44b4-862a-293ceb4321c9-catalog-content\") pod \"redhat-marketplace-t7zbl\" (UID: \"c63d5fd4-0d21-44b4-862a-293ceb4321c9\") " pod="openshift-marketplace/redhat-marketplace-t7zbl" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.414753 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c63d5fd4-0d21-44b4-862a-293ceb4321c9-utilities\") pod \"redhat-marketplace-t7zbl\" (UID: \"c63d5fd4-0d21-44b4-862a-293ceb4321c9\") " pod="openshift-marketplace/redhat-marketplace-t7zbl" Nov 26 22:41:18 crc kubenswrapper[5008]: E1126 22:41:18.414789 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:18.914758909 +0000 UTC m=+154.327452911 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.443150 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sxqc\" (UniqueName: \"kubernetes.io/projected/c63d5fd4-0d21-44b4-862a-293ceb4321c9-kube-api-access-9sxqc\") pod \"redhat-marketplace-t7zbl\" (UID: \"c63d5fd4-0d21-44b4-862a-293ceb4321c9\") " pod="openshift-marketplace/redhat-marketplace-t7zbl" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.514834 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:18 crc kubenswrapper[5008]: E1126 22:41:18.515076 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:19.015040267 +0000 UTC m=+154.427734269 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.534864 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t7zbl" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.582912 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:18 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:18 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:18 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.582974 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.601071 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mkfvd"] Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.602127 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkfvd" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.612631 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkfvd"] Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.615872 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngh8v\" (UniqueName: \"kubernetes.io/projected/2e4023c0-2da0-4ad1-8709-e855a934018c-kube-api-access-ngh8v\") pod \"redhat-marketplace-mkfvd\" (UID: \"2e4023c0-2da0-4ad1-8709-e855a934018c\") " pod="openshift-marketplace/redhat-marketplace-mkfvd" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.616025 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e4023c0-2da0-4ad1-8709-e855a934018c-catalog-content\") pod \"redhat-marketplace-mkfvd\" (UID: \"2e4023c0-2da0-4ad1-8709-e855a934018c\") " pod="openshift-marketplace/redhat-marketplace-mkfvd" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.616121 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.616222 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e4023c0-2da0-4ad1-8709-e855a934018c-utilities\") pod \"redhat-marketplace-mkfvd\" (UID: \"2e4023c0-2da0-4ad1-8709-e855a934018c\") " pod="openshift-marketplace/redhat-marketplace-mkfvd" Nov 26 22:41:18 crc kubenswrapper[5008]: E1126 
22:41:18.616413 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 22:41:19.116400775 +0000 UTC m=+154.529094777 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xf97j" (UID: "90717c11-47b1-4265-8ea3-9c826850e812") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.692932 5008 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-26T22:41:18.1391418Z","Handler":null,"Name":""} Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.718249 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.718589 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngh8v\" (UniqueName: \"kubernetes.io/projected/2e4023c0-2da0-4ad1-8709-e855a934018c-kube-api-access-ngh8v\") pod \"redhat-marketplace-mkfvd\" (UID: \"2e4023c0-2da0-4ad1-8709-e855a934018c\") " pod="openshift-marketplace/redhat-marketplace-mkfvd" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.718661 5008 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e4023c0-2da0-4ad1-8709-e855a934018c-catalog-content\") pod \"redhat-marketplace-mkfvd\" (UID: \"2e4023c0-2da0-4ad1-8709-e855a934018c\") " pod="openshift-marketplace/redhat-marketplace-mkfvd" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.718732 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e4023c0-2da0-4ad1-8709-e855a934018c-utilities\") pod \"redhat-marketplace-mkfvd\" (UID: \"2e4023c0-2da0-4ad1-8709-e855a934018c\") " pod="openshift-marketplace/redhat-marketplace-mkfvd" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.719257 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e4023c0-2da0-4ad1-8709-e855a934018c-utilities\") pod \"redhat-marketplace-mkfvd\" (UID: \"2e4023c0-2da0-4ad1-8709-e855a934018c\") " pod="openshift-marketplace/redhat-marketplace-mkfvd" Nov 26 22:41:18 crc kubenswrapper[5008]: E1126 22:41:18.720837 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 22:41:19.220806737 +0000 UTC m=+154.633500779 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.721983 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e4023c0-2da0-4ad1-8709-e855a934018c-catalog-content\") pod \"redhat-marketplace-mkfvd\" (UID: \"2e4023c0-2da0-4ad1-8709-e855a934018c\") " pod="openshift-marketplace/redhat-marketplace-mkfvd" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.742122 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngh8v\" (UniqueName: \"kubernetes.io/projected/2e4023c0-2da0-4ad1-8709-e855a934018c-kube-api-access-ngh8v\") pod \"redhat-marketplace-mkfvd\" (UID: \"2e4023c0-2da0-4ad1-8709-e855a934018c\") " pod="openshift-marketplace/redhat-marketplace-mkfvd" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.771024 5008 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.771085 5008 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.789009 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t7zbl"] Nov 26 22:41:18 crc kubenswrapper[5008]: W1126 22:41:18.798206 5008 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc63d5fd4_0d21_44b4_862a_293ceb4321c9.slice/crio-968e33b2444dc2fd2372115521e881033eec0e3a711e19864ec18295c139967d WatchSource:0}: Error finding container 968e33b2444dc2fd2372115521e881033eec0e3a711e19864ec18295c139967d: Status 404 returned error can't find the container with id 968e33b2444dc2fd2372115521e881033eec0e3a711e19864ec18295c139967d Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.813317 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.820300 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.823419 5008 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.823456 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.836436 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.851013 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9kn85" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.895002 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.912125 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xf97j\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.913677 5008 patch_prober.go:28] interesting pod/downloads-7954f5f757-t29mr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Nov 26 
22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.913733 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t29mr" podUID="6f72cb1c-2994-4ed4-8333-b7498c1615bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.913851 5008 patch_prober.go:28] interesting pod/downloads-7954f5f757-t29mr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.913896 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-t29mr" podUID="6f72cb1c-2994-4ed4-8333-b7498c1615bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.919569 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkfvd" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.921645 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.995399 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.995468 5008 patch_prober.go:28] interesting pod/apiserver-76f77b778f-w5vzs container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 26 22:41:18 crc kubenswrapper[5008]: [+]log ok Nov 26 22:41:18 crc kubenswrapper[5008]: [+]etcd ok Nov 26 22:41:18 crc kubenswrapper[5008]: [+]poststarthook/start-apiserver-admission-initializer ok Nov 26 22:41:18 crc kubenswrapper[5008]: [+]poststarthook/generic-apiserver-start-informers ok Nov 26 22:41:18 crc kubenswrapper[5008]: [+]poststarthook/max-in-flight-filter ok Nov 26 22:41:18 crc kubenswrapper[5008]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 26 22:41:18 crc kubenswrapper[5008]: [+]poststarthook/image.openshift.io-apiserver-caches ok Nov 26 22:41:18 crc kubenswrapper[5008]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Nov 26 22:41:18 crc 
kubenswrapper[5008]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Nov 26 22:41:18 crc kubenswrapper[5008]: [+]poststarthook/project.openshift.io-projectcache ok Nov 26 22:41:18 crc kubenswrapper[5008]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Nov 26 22:41:18 crc kubenswrapper[5008]: [+]poststarthook/openshift.io-startinformers ok Nov 26 22:41:18 crc kubenswrapper[5008]: [+]poststarthook/openshift.io-restmapperupdater ok Nov 26 22:41:18 crc kubenswrapper[5008]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 26 22:41:18 crc kubenswrapper[5008]: livez check failed Nov 26 22:41:18 crc kubenswrapper[5008]: I1126 22:41:18.995526 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" podUID="397c9d40-0280-4491-8aa1-2f97f28d0b9e" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.117871 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.131537 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" event={"ID":"e547d918-c76b-4a9a-8717-ee227b89818d","Type":"ContainerStarted","Data":"f0c3e18a0b3730fffbedcea5abea590d2339dbf5287b17b204bb85c51579b36f"} Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.144868 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7zbl" event={"ID":"c63d5fd4-0d21-44b4-862a-293ceb4321c9","Type":"ContainerStarted","Data":"cff212cff9ace6446904ad73ea5da4dff62454c74c37e5dc84ab174201fa326c"} Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.144901 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7zbl" event={"ID":"c63d5fd4-0d21-44b4-862a-293ceb4321c9","Type":"ContainerStarted","Data":"968e33b2444dc2fd2372115521e881033eec0e3a711e19864ec18295c139967d"} Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.165901 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-lbcxn" podStartSLOduration=13.165887443 podStartE2EDuration="13.165887443s" podCreationTimestamp="2025-11-26 22:41:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:19.164297239 +0000 UTC m=+154.576991251" watchObservedRunningTime="2025-11-26 22:41:19.165887443 +0000 UTC m=+154.578581445" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.192881 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.192920 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-65m9h" 
Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.199594 5008 patch_prober.go:28] interesting pod/console-f9d7485db-65m9h container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.199661 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-65m9h" podUID="3e07175e-42fa-4065-adb1-3469e75ea4d8" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.211553 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-qdkcv" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.211856 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5t8g6"] Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.213473 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5t8g6" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.216714 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.227153 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d66bc36d-82f1-4825-bb7d-a89eea9587e5-catalog-content\") pod \"redhat-operators-5t8g6\" (UID: \"d66bc36d-82f1-4825-bb7d-a89eea9587e5\") " pod="openshift-marketplace/redhat-operators-5t8g6" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.227238 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d66bc36d-82f1-4825-bb7d-a89eea9587e5-utilities\") pod \"redhat-operators-5t8g6\" (UID: \"d66bc36d-82f1-4825-bb7d-a89eea9587e5\") " pod="openshift-marketplace/redhat-operators-5t8g6" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.227312 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd7r7\" (UniqueName: \"kubernetes.io/projected/d66bc36d-82f1-4825-bb7d-a89eea9587e5-kube-api-access-bd7r7\") pod \"redhat-operators-5t8g6\" (UID: \"d66bc36d-82f1-4825-bb7d-a89eea9587e5\") " pod="openshift-marketplace/redhat-operators-5t8g6" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.245094 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5t8g6"] Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.294958 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkfvd"] Nov 26 22:41:19 crc kubenswrapper[5008]: W1126 22:41:19.311010 5008 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e4023c0_2da0_4ad1_8709_e855a934018c.slice/crio-efb5e533363e50b21fc9c20a09644f5ce065f6d03bc2acb2356ac74f0c300f55 WatchSource:0}: Error finding container efb5e533363e50b21fc9c20a09644f5ce065f6d03bc2acb2356ac74f0c300f55: Status 404 returned error can't find the container with id efb5e533363e50b21fc9c20a09644f5ce065f6d03bc2acb2356ac74f0c300f55 Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.332228 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd7r7\" (UniqueName: \"kubernetes.io/projected/d66bc36d-82f1-4825-bb7d-a89eea9587e5-kube-api-access-bd7r7\") pod \"redhat-operators-5t8g6\" (UID: \"d66bc36d-82f1-4825-bb7d-a89eea9587e5\") " pod="openshift-marketplace/redhat-operators-5t8g6" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.332299 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d66bc36d-82f1-4825-bb7d-a89eea9587e5-catalog-content\") pod \"redhat-operators-5t8g6\" (UID: \"d66bc36d-82f1-4825-bb7d-a89eea9587e5\") " pod="openshift-marketplace/redhat-operators-5t8g6" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.332339 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d66bc36d-82f1-4825-bb7d-a89eea9587e5-utilities\") pod \"redhat-operators-5t8g6\" (UID: \"d66bc36d-82f1-4825-bb7d-a89eea9587e5\") " pod="openshift-marketplace/redhat-operators-5t8g6" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.332707 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d66bc36d-82f1-4825-bb7d-a89eea9587e5-utilities\") pod \"redhat-operators-5t8g6\" (UID: \"d66bc36d-82f1-4825-bb7d-a89eea9587e5\") " pod="openshift-marketplace/redhat-operators-5t8g6" Nov 26 22:41:19 crc kubenswrapper[5008]: 
I1126 22:41:19.333439 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d66bc36d-82f1-4825-bb7d-a89eea9587e5-catalog-content\") pod \"redhat-operators-5t8g6\" (UID: \"d66bc36d-82f1-4825-bb7d-a89eea9587e5\") " pod="openshift-marketplace/redhat-operators-5t8g6" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.336225 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qvj" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.349788 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dcs8r" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.351512 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd7r7\" (UniqueName: \"kubernetes.io/projected/d66bc36d-82f1-4825-bb7d-a89eea9587e5-kube-api-access-bd7r7\") pod \"redhat-operators-5t8g6\" (UID: \"d66bc36d-82f1-4825-bb7d-a89eea9587e5\") " pod="openshift-marketplace/redhat-operators-5t8g6" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.372696 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.406542 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2f7k" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.459606 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.526236 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.565543 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5t8g6" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.578738 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.581617 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:19 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:19 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:19 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.581662 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.603237 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ccg9j"] Nov 26 22:41:19 crc kubenswrapper[5008]: E1126 22:41:19.603424 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c" containerName="pruner" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.603434 5008 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c" containerName="pruner" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.603528 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c" containerName="pruner" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.604154 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ccg9j" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.614360 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ccg9j"] Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.637082 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c-kubelet-dir\") pod \"a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c\" (UID: \"a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c\") " Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.637150 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c-kube-api-access\") pod \"a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c\" (UID: \"a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c\") " Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.638104 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c" (UID: "a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.644168 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c" (UID: "a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.662248 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xf97j"] Nov 26 22:41:19 crc kubenswrapper[5008]: W1126 22:41:19.672134 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90717c11_47b1_4265_8ea3_9c826850e812.slice/crio-65f6fc9f585e3217afad8b36c37d9225947b001f04fc6355ad8b1f19ee23f8e4 WatchSource:0}: Error finding container 65f6fc9f585e3217afad8b36c37d9225947b001f04fc6355ad8b1f19ee23f8e4: Status 404 returned error can't find the container with id 65f6fc9f585e3217afad8b36c37d9225947b001f04fc6355ad8b1f19ee23f8e4 Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.739221 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94caca18-eed8-46f1-818d-51433f83ae9c-catalog-content\") pod \"redhat-operators-ccg9j\" (UID: \"94caca18-eed8-46f1-818d-51433f83ae9c\") " pod="openshift-marketplace/redhat-operators-ccg9j" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.739269 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94caca18-eed8-46f1-818d-51433f83ae9c-utilities\") pod \"redhat-operators-ccg9j\" (UID: \"94caca18-eed8-46f1-818d-51433f83ae9c\") " 
pod="openshift-marketplace/redhat-operators-ccg9j" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.739289 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r2jn\" (UniqueName: \"kubernetes.io/projected/94caca18-eed8-46f1-818d-51433f83ae9c-kube-api-access-2r2jn\") pod \"redhat-operators-ccg9j\" (UID: \"94caca18-eed8-46f1-818d-51433f83ae9c\") " pod="openshift-marketplace/redhat-operators-ccg9j" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.739334 5008 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.739346 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.788774 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5t8g6"] Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.840194 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94caca18-eed8-46f1-818d-51433f83ae9c-catalog-content\") pod \"redhat-operators-ccg9j\" (UID: \"94caca18-eed8-46f1-818d-51433f83ae9c\") " pod="openshift-marketplace/redhat-operators-ccg9j" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.840243 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94caca18-eed8-46f1-818d-51433f83ae9c-utilities\") pod \"redhat-operators-ccg9j\" (UID: \"94caca18-eed8-46f1-818d-51433f83ae9c\") " pod="openshift-marketplace/redhat-operators-ccg9j" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 
22:41:19.840268 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r2jn\" (UniqueName: \"kubernetes.io/projected/94caca18-eed8-46f1-818d-51433f83ae9c-kube-api-access-2r2jn\") pod \"redhat-operators-ccg9j\" (UID: \"94caca18-eed8-46f1-818d-51433f83ae9c\") " pod="openshift-marketplace/redhat-operators-ccg9j" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.840926 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94caca18-eed8-46f1-818d-51433f83ae9c-catalog-content\") pod \"redhat-operators-ccg9j\" (UID: \"94caca18-eed8-46f1-818d-51433f83ae9c\") " pod="openshift-marketplace/redhat-operators-ccg9j" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.841150 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94caca18-eed8-46f1-818d-51433f83ae9c-utilities\") pod \"redhat-operators-ccg9j\" (UID: \"94caca18-eed8-46f1-818d-51433f83ae9c\") " pod="openshift-marketplace/redhat-operators-ccg9j" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.855047 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r2jn\" (UniqueName: \"kubernetes.io/projected/94caca18-eed8-46f1-818d-51433f83ae9c-kube-api-access-2r2jn\") pod \"redhat-operators-ccg9j\" (UID: \"94caca18-eed8-46f1-818d-51433f83ae9c\") " pod="openshift-marketplace/redhat-operators-ccg9j" Nov 26 22:41:19 crc kubenswrapper[5008]: I1126 22:41:19.923786 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ccg9j" Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.152208 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5t8g6" event={"ID":"d66bc36d-82f1-4825-bb7d-a89eea9587e5","Type":"ContainerStarted","Data":"faa8d9d66693982b4caeaf2ce53f1987a419c52a55e1115cc14616390913e8d1"} Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.154450 5008 generic.go:334] "Generic (PLEG): container finished" podID="c63d5fd4-0d21-44b4-862a-293ceb4321c9" containerID="cff212cff9ace6446904ad73ea5da4dff62454c74c37e5dc84ab174201fa326c" exitCode=0 Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.154517 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7zbl" event={"ID":"c63d5fd4-0d21-44b4-862a-293ceb4321c9","Type":"ContainerDied","Data":"cff212cff9ace6446904ad73ea5da4dff62454c74c37e5dc84ab174201fa326c"} Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.155622 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkfvd" event={"ID":"2e4023c0-2da0-4ad1-8709-e855a934018c","Type":"ContainerStarted","Data":"efb5e533363e50b21fc9c20a09644f5ce065f6d03bc2acb2356ac74f0c300f55"} Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.156873 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a2d73a3e-6cc1-4f6c-97cb-b1b091f78b7c","Type":"ContainerDied","Data":"9a935efed74ee2accfead647c26de3496c5e09a8f9a308193fe02261b983cba9"} Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.156900 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a935efed74ee2accfead647c26de3496c5e09a8f9a308193fe02261b983cba9" Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.157018 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.167191 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" event={"ID":"90717c11-47b1-4265-8ea3-9c826850e812","Type":"ContainerStarted","Data":"65f6fc9f585e3217afad8b36c37d9225947b001f04fc6355ad8b1f19ee23f8e4"} Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.308245 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.310504 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.319549 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.320110 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.320291 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.380288 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ccg9j"] Nov 26 22:41:20 crc kubenswrapper[5008]: W1126 22:41:20.386638 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94caca18_eed8_46f1_818d_51433f83ae9c.slice/crio-8e551cf9f526dc582248f43b650895463870e77d77d174e8417fe5d9dd69f8af WatchSource:0}: Error finding container 8e551cf9f526dc582248f43b650895463870e77d77d174e8417fe5d9dd69f8af: Status 404 returned error can't find the container with id 
8e551cf9f526dc582248f43b650895463870e77d77d174e8417fe5d9dd69f8af Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.449166 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/73a237a3-af26-4376-a287-223050c2334a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"73a237a3-af26-4376-a287-223050c2334a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.449226 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/73a237a3-af26-4376-a287-223050c2334a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"73a237a3-af26-4376-a287-223050c2334a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.550813 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/73a237a3-af26-4376-a287-223050c2334a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"73a237a3-af26-4376-a287-223050c2334a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.550890 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/73a237a3-af26-4376-a287-223050c2334a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"73a237a3-af26-4376-a287-223050c2334a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.551020 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/73a237a3-af26-4376-a287-223050c2334a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"73a237a3-af26-4376-a287-223050c2334a\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.566061 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/73a237a3-af26-4376-a287-223050c2334a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"73a237a3-af26-4376-a287-223050c2334a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.582514 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:20 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:20 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:20 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.582576 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:20 crc kubenswrapper[5008]: I1126 22:41:20.659693 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 22:41:21 crc kubenswrapper[5008]: I1126 22:41:21.130290 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 26 22:41:21 crc kubenswrapper[5008]: W1126 22:41:21.140949 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod73a237a3_af26_4376_a287_223050c2334a.slice/crio-7825b8e15d8d25d56783013bc8503d40c7bef8ddc3eb430fdd92385c59f5fa55 WatchSource:0}: Error finding container 7825b8e15d8d25d56783013bc8503d40c7bef8ddc3eb430fdd92385c59f5fa55: Status 404 returned error can't find the container with id 7825b8e15d8d25d56783013bc8503d40c7bef8ddc3eb430fdd92385c59f5fa55 Nov 26 22:41:21 crc kubenswrapper[5008]: I1126 22:41:21.172793 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"73a237a3-af26-4376-a287-223050c2334a","Type":"ContainerStarted","Data":"7825b8e15d8d25d56783013bc8503d40c7bef8ddc3eb430fdd92385c59f5fa55"} Nov 26 22:41:21 crc kubenswrapper[5008]: I1126 22:41:21.173561 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccg9j" event={"ID":"94caca18-eed8-46f1-818d-51433f83ae9c","Type":"ContainerStarted","Data":"8e551cf9f526dc582248f43b650895463870e77d77d174e8417fe5d9dd69f8af"} Nov 26 22:41:21 crc kubenswrapper[5008]: I1126 22:41:21.417460 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nzkzs" Nov 26 22:41:21 crc kubenswrapper[5008]: I1126 22:41:21.584602 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:21 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:21 crc 
kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:21 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:21 crc kubenswrapper[5008]: I1126 22:41:21.584664 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:22 crc kubenswrapper[5008]: I1126 22:41:22.182425 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkfvd" event={"ID":"2e4023c0-2da0-4ad1-8709-e855a934018c","Type":"ContainerDied","Data":"5bfaf6ca0c6248e7aa8da4a05e0302b323fd206b26f8bb8279dbdef2f3ace0c0"} Nov 26 22:41:22 crc kubenswrapper[5008]: I1126 22:41:22.182365 5008 generic.go:334] "Generic (PLEG): container finished" podID="2e4023c0-2da0-4ad1-8709-e855a934018c" containerID="5bfaf6ca0c6248e7aa8da4a05e0302b323fd206b26f8bb8279dbdef2f3ace0c0" exitCode=0 Nov 26 22:41:22 crc kubenswrapper[5008]: I1126 22:41:22.185865 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" event={"ID":"90717c11-47b1-4265-8ea3-9c826850e812","Type":"ContainerStarted","Data":"746247554c4ebdf934c0e359d585d6bd9869ed384da89958c8c612c5f281dadf"} Nov 26 22:41:22 crc kubenswrapper[5008]: I1126 22:41:22.194558 5008 generic.go:334] "Generic (PLEG): container finished" podID="94caca18-eed8-46f1-818d-51433f83ae9c" containerID="7f39b45e537c81455d246567bc429140a911db5e87e6db691b9b6e975590a514" exitCode=0 Nov 26 22:41:22 crc kubenswrapper[5008]: I1126 22:41:22.194620 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccg9j" event={"ID":"94caca18-eed8-46f1-818d-51433f83ae9c","Type":"ContainerDied","Data":"7f39b45e537c81455d246567bc429140a911db5e87e6db691b9b6e975590a514"} Nov 26 22:41:22 crc kubenswrapper[5008]: I1126 22:41:22.196457 5008 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5t8g6" event={"ID":"d66bc36d-82f1-4825-bb7d-a89eea9587e5","Type":"ContainerStarted","Data":"6f70281f701131d2e0d3fbac329428cd7f3d336a3952d2e0bb0845b5111ef1be"} Nov 26 22:41:22 crc kubenswrapper[5008]: I1126 22:41:22.580699 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:22 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:22 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:22 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:22 crc kubenswrapper[5008]: I1126 22:41:22.580792 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:23 crc kubenswrapper[5008]: I1126 22:41:23.203472 5008 generic.go:334] "Generic (PLEG): container finished" podID="d66bc36d-82f1-4825-bb7d-a89eea9587e5" containerID="6f70281f701131d2e0d3fbac329428cd7f3d336a3952d2e0bb0845b5111ef1be" exitCode=0 Nov 26 22:41:23 crc kubenswrapper[5008]: I1126 22:41:23.203545 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5t8g6" event={"ID":"d66bc36d-82f1-4825-bb7d-a89eea9587e5","Type":"ContainerDied","Data":"6f70281f701131d2e0d3fbac329428cd7f3d336a3952d2e0bb0845b5111ef1be"} Nov 26 22:41:23 crc kubenswrapper[5008]: I1126 22:41:23.205936 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"73a237a3-af26-4376-a287-223050c2334a","Type":"ContainerStarted","Data":"884e5639c576a7499cbee45b7c1686ab6cd5effb3405310b8772e923da57e597"} Nov 26 22:41:23 crc kubenswrapper[5008]: I1126 
22:41:23.270812 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" podStartSLOduration=136.270790701 podStartE2EDuration="2m16.270790701s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:23.266310696 +0000 UTC m=+158.679004718" watchObservedRunningTime="2025-11-26 22:41:23.270790701 +0000 UTC m=+158.683484693" Nov 26 22:41:23 crc kubenswrapper[5008]: I1126 22:41:23.581789 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:23 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:23 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:23 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:23 crc kubenswrapper[5008]: I1126 22:41:23.582178 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:23 crc kubenswrapper[5008]: I1126 22:41:23.970285 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:23 crc kubenswrapper[5008]: I1126 22:41:23.980575 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-w5vzs" Nov 26 22:41:24 crc kubenswrapper[5008]: I1126 22:41:24.233404 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.233388045 
podStartE2EDuration="4.233388045s" podCreationTimestamp="2025-11-26 22:41:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:41:24.231172723 +0000 UTC m=+159.643866735" watchObservedRunningTime="2025-11-26 22:41:24.233388045 +0000 UTC m=+159.646082047" Nov 26 22:41:24 crc kubenswrapper[5008]: I1126 22:41:24.585225 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:24 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:24 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:24 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:24 crc kubenswrapper[5008]: I1126 22:41:24.585290 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:25 crc kubenswrapper[5008]: I1126 22:41:25.225221 5008 generic.go:334] "Generic (PLEG): container finished" podID="73a237a3-af26-4376-a287-223050c2334a" containerID="884e5639c576a7499cbee45b7c1686ab6cd5effb3405310b8772e923da57e597" exitCode=0 Nov 26 22:41:25 crc kubenswrapper[5008]: I1126 22:41:25.225304 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"73a237a3-af26-4376-a287-223050c2334a","Type":"ContainerDied","Data":"884e5639c576a7499cbee45b7c1686ab6cd5effb3405310b8772e923da57e597"} Nov 26 22:41:25 crc kubenswrapper[5008]: I1126 22:41:25.581048 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed 
with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:25 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:25 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:25 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:25 crc kubenswrapper[5008]: I1126 22:41:25.581296 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:26 crc kubenswrapper[5008]: I1126 22:41:26.581481 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:26 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:26 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:26 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:26 crc kubenswrapper[5008]: I1126 22:41:26.581547 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:27 crc kubenswrapper[5008]: I1126 22:41:27.581795 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:27 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:27 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:27 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:27 crc kubenswrapper[5008]: I1126 
22:41:27.582180 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:28 crc kubenswrapper[5008]: I1126 22:41:28.186652 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:41:28 crc kubenswrapper[5008]: I1126 22:41:28.586272 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:28 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:28 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:28 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:28 crc kubenswrapper[5008]: I1126 22:41:28.586333 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:28 crc kubenswrapper[5008]: I1126 22:41:28.911854 5008 patch_prober.go:28] interesting pod/downloads-7954f5f757-t29mr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Nov 26 22:41:28 crc kubenswrapper[5008]: I1126 22:41:28.911912 5008 patch_prober.go:28] interesting pod/downloads-7954f5f757-t29mr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Nov 26 22:41:28 crc kubenswrapper[5008]: I1126 
22:41:28.911932 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t29mr" podUID="6f72cb1c-2994-4ed4-8333-b7498c1615bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Nov 26 22:41:28 crc kubenswrapper[5008]: I1126 22:41:28.912030 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-t29mr" podUID="6f72cb1c-2994-4ed4-8333-b7498c1615bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Nov 26 22:41:29 crc kubenswrapper[5008]: I1126 22:41:29.119586 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:29 crc kubenswrapper[5008]: I1126 22:41:29.192751 5008 patch_prober.go:28] interesting pod/console-f9d7485db-65m9h container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Nov 26 22:41:29 crc kubenswrapper[5008]: I1126 22:41:29.192801 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-65m9h" podUID="3e07175e-42fa-4065-adb1-3469e75ea4d8" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Nov 26 22:41:29 crc kubenswrapper[5008]: I1126 22:41:29.280938 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 22:41:29 crc kubenswrapper[5008]: I1126 22:41:29.281012 5008 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 22:41:29 crc kubenswrapper[5008]: I1126 22:41:29.581006 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:29 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:29 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:29 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:29 crc kubenswrapper[5008]: I1126 22:41:29.581071 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:29 crc kubenswrapper[5008]: I1126 22:41:29.893432 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs\") pod \"network-metrics-daemon-xplkg\" (UID: \"7adf9a69-5de6-4710-b394-968387df9ae6\") " pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:41:29 crc kubenswrapper[5008]: I1126 22:41:29.914882 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7adf9a69-5de6-4710-b394-968387df9ae6-metrics-certs\") pod \"network-metrics-daemon-xplkg\" (UID: \"7adf9a69-5de6-4710-b394-968387df9ae6\") " pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:41:30 crc kubenswrapper[5008]: I1126 22:41:30.031385 5008 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xplkg" Nov 26 22:41:30 crc kubenswrapper[5008]: I1126 22:41:30.581144 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:30 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:30 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:30 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:30 crc kubenswrapper[5008]: I1126 22:41:30.581207 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:31 crc kubenswrapper[5008]: I1126 22:41:31.581664 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:31 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:31 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:31 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:31 crc kubenswrapper[5008]: I1126 22:41:31.581730 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:32 crc kubenswrapper[5008]: I1126 22:41:32.581583 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure 
output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:32 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:32 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:32 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:32 crc kubenswrapper[5008]: I1126 22:41:32.581826 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:33 crc kubenswrapper[5008]: I1126 22:41:33.581817 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:33 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:33 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:33 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:33 crc kubenswrapper[5008]: I1126 22:41:33.581894 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:34 crc kubenswrapper[5008]: I1126 22:41:34.580238 5008 patch_prober.go:28] interesting pod/router-default-5444994796-qhtsg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 22:41:34 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Nov 26 22:41:34 crc kubenswrapper[5008]: [+]process-running ok Nov 26 22:41:34 crc kubenswrapper[5008]: healthz check failed Nov 26 22:41:34 crc 
kubenswrapper[5008]: I1126 22:41:34.580295 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtsg" podUID="f8023be5-ae5d-4ca1-8541-6ec8d838e1df" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 22:41:35 crc kubenswrapper[5008]: I1126 22:41:35.282998 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-p5xvf_743cf3d7-7296-4b87-94e3-67fc66dacca7/cluster-samples-operator/0.log" Nov 26 22:41:35 crc kubenswrapper[5008]: I1126 22:41:35.283076 5008 generic.go:334] "Generic (PLEG): container finished" podID="743cf3d7-7296-4b87-94e3-67fc66dacca7" containerID="537251dabc7a71c7b17804c1a6fdc6b6577da5d2bda726d33bfdaa7e899b79b1" exitCode=2 Nov 26 22:41:35 crc kubenswrapper[5008]: I1126 22:41:35.283132 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p5xvf" event={"ID":"743cf3d7-7296-4b87-94e3-67fc66dacca7","Type":"ContainerDied","Data":"537251dabc7a71c7b17804c1a6fdc6b6577da5d2bda726d33bfdaa7e899b79b1"} Nov 26 22:41:35 crc kubenswrapper[5008]: I1126 22:41:35.283720 5008 scope.go:117] "RemoveContainer" containerID="537251dabc7a71c7b17804c1a6fdc6b6577da5d2bda726d33bfdaa7e899b79b1" Nov 26 22:41:35 crc kubenswrapper[5008]: I1126 22:41:35.594946 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:35 crc kubenswrapper[5008]: I1126 22:41:35.599491 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-qhtsg" Nov 26 22:41:35 crc kubenswrapper[5008]: I1126 22:41:35.793940 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 22:41:35 crc kubenswrapper[5008]: I1126 22:41:35.882291 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/73a237a3-af26-4376-a287-223050c2334a-kube-api-access\") pod \"73a237a3-af26-4376-a287-223050c2334a\" (UID: \"73a237a3-af26-4376-a287-223050c2334a\") " Nov 26 22:41:35 crc kubenswrapper[5008]: I1126 22:41:35.882379 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/73a237a3-af26-4376-a287-223050c2334a-kubelet-dir\") pod \"73a237a3-af26-4376-a287-223050c2334a\" (UID: \"73a237a3-af26-4376-a287-223050c2334a\") " Nov 26 22:41:35 crc kubenswrapper[5008]: I1126 22:41:35.882643 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73a237a3-af26-4376-a287-223050c2334a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "73a237a3-af26-4376-a287-223050c2334a" (UID: "73a237a3-af26-4376-a287-223050c2334a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:41:35 crc kubenswrapper[5008]: I1126 22:41:35.889130 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a237a3-af26-4376-a287-223050c2334a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "73a237a3-af26-4376-a287-223050c2334a" (UID: "73a237a3-af26-4376-a287-223050c2334a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:41:35 crc kubenswrapper[5008]: I1126 22:41:35.983003 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/73a237a3-af26-4376-a287-223050c2334a-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 22:41:35 crc kubenswrapper[5008]: I1126 22:41:35.983043 5008 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/73a237a3-af26-4376-a287-223050c2334a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 26 22:41:36 crc kubenswrapper[5008]: I1126 22:41:36.290093 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"73a237a3-af26-4376-a287-223050c2334a","Type":"ContainerDied","Data":"7825b8e15d8d25d56783013bc8503d40c7bef8ddc3eb430fdd92385c59f5fa55"} Nov 26 22:41:36 crc kubenswrapper[5008]: I1126 22:41:36.290120 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 22:41:36 crc kubenswrapper[5008]: I1126 22:41:36.290129 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7825b8e15d8d25d56783013bc8503d40c7bef8ddc3eb430fdd92385c59f5fa55" Nov 26 22:41:38 crc kubenswrapper[5008]: I1126 22:41:38.912391 5008 patch_prober.go:28] interesting pod/downloads-7954f5f757-t29mr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Nov 26 22:41:38 crc kubenswrapper[5008]: I1126 22:41:38.912498 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-t29mr" podUID="6f72cb1c-2994-4ed4-8333-b7498c1615bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Nov 26 22:41:38 crc kubenswrapper[5008]: I1126 22:41:38.912547 5008 patch_prober.go:28] interesting pod/downloads-7954f5f757-t29mr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Nov 26 22:41:38 crc kubenswrapper[5008]: I1126 22:41:38.912583 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-t29mr" Nov 26 22:41:38 crc kubenswrapper[5008]: I1126 22:41:38.912598 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t29mr" podUID="6f72cb1c-2994-4ed4-8333-b7498c1615bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Nov 26 22:41:38 crc kubenswrapper[5008]: I1126 22:41:38.913379 5008 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-t29mr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Nov 26 22:41:38 crc kubenswrapper[5008]: I1126 22:41:38.913460 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t29mr" podUID="6f72cb1c-2994-4ed4-8333-b7498c1615bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Nov 26 22:41:38 crc kubenswrapper[5008]: I1126 22:41:38.913592 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"d5cabb6f5b416e08c83138ff9ad0c4e402f6752eac32b72f4939e79ee3775035"} pod="openshift-console/downloads-7954f5f757-t29mr" containerMessage="Container download-server failed liveness probe, will be restarted" Nov 26 22:41:38 crc kubenswrapper[5008]: I1126 22:41:38.913797 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-t29mr" podUID="6f72cb1c-2994-4ed4-8333-b7498c1615bf" containerName="download-server" containerID="cri-o://d5cabb6f5b416e08c83138ff9ad0c4e402f6752eac32b72f4939e79ee3775035" gracePeriod=2 Nov 26 22:41:39 crc kubenswrapper[5008]: I1126 22:41:39.127340 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:41:39 crc kubenswrapper[5008]: I1126 22:41:39.193025 5008 patch_prober.go:28] interesting pod/console-f9d7485db-65m9h container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Nov 26 22:41:39 crc kubenswrapper[5008]: I1126 22:41:39.193087 5008 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-console/console-f9d7485db-65m9h" podUID="3e07175e-42fa-4065-adb1-3469e75ea4d8" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Nov 26 22:41:41 crc kubenswrapper[5008]: I1126 22:41:41.328011 5008 generic.go:334] "Generic (PLEG): container finished" podID="6f72cb1c-2994-4ed4-8333-b7498c1615bf" containerID="d5cabb6f5b416e08c83138ff9ad0c4e402f6752eac32b72f4939e79ee3775035" exitCode=0 Nov 26 22:41:41 crc kubenswrapper[5008]: I1126 22:41:41.328141 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-t29mr" event={"ID":"6f72cb1c-2994-4ed4-8333-b7498c1615bf","Type":"ContainerDied","Data":"d5cabb6f5b416e08c83138ff9ad0c4e402f6752eac32b72f4939e79ee3775035"} Nov 26 22:41:47 crc kubenswrapper[5008]: E1126 22:41:47.497817 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 26 22:41:47 crc kubenswrapper[5008]: E1126 22:41:47.498480 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6qmpj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-xqtk6_openshift-marketplace(278822ff-d2bf-46ef-af05-b6e32a132844): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 26 22:41:47 crc kubenswrapper[5008]: E1126 22:41:47.499799 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-xqtk6" podUID="278822ff-d2bf-46ef-af05-b6e32a132844" Nov 26 22:41:48 crc 
kubenswrapper[5008]: I1126 22:41:48.913416 5008 patch_prober.go:28] interesting pod/downloads-7954f5f757-t29mr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Nov 26 22:41:48 crc kubenswrapper[5008]: I1126 22:41:48.913483 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t29mr" podUID="6f72cb1c-2994-4ed4-8333-b7498c1615bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Nov 26 22:41:49 crc kubenswrapper[5008]: I1126 22:41:49.197705 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:49 crc kubenswrapper[5008]: I1126 22:41:49.203480 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-65m9h" Nov 26 22:41:49 crc kubenswrapper[5008]: I1126 22:41:49.639831 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7hhl" Nov 26 22:41:49 crc kubenswrapper[5008]: E1126 22:41:49.974460 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-xqtk6" podUID="278822ff-d2bf-46ef-af05-b6e32a132844" Nov 26 22:41:53 crc kubenswrapper[5008]: E1126 22:41:53.209262 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 26 22:41:53 crc kubenswrapper[5008]: 
E1126 22:41:53.209725 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ngr7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-pbtdl_openshift-marketplace(3f7c489e-32bf-4761-a529-e8ca560145ad): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 26 22:41:53 crc kubenswrapper[5008]: E1126 22:41:53.211425 5008 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-pbtdl" podUID="3f7c489e-32bf-4761-a529-e8ca560145ad" Nov 26 22:41:54 crc kubenswrapper[5008]: I1126 22:41:54.579647 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 22:41:55 crc kubenswrapper[5008]: E1126 22:41:55.088058 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 26 22:41:55 crc kubenswrapper[5008]: E1126 22:41:55.088290 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s5gnd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-hqxrq_openshift-marketplace(012ef2d4-9b7c-4838-9b17-345ee0057cd1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 26 22:41:55 crc kubenswrapper[5008]: E1126 22:41:55.089619 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-hqxrq" podUID="012ef2d4-9b7c-4838-9b17-345ee0057cd1" Nov 26 22:41:55 crc 
kubenswrapper[5008]: E1126 22:41:55.094377 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 26 22:41:55 crc kubenswrapper[5008]: E1126 22:41:55.094507 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mm4w6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-9s8kp_openshift-marketplace(77584e4e-5147-421a-b7c0-24c7403c003a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 26 22:41:55 crc kubenswrapper[5008]: E1126 22:41:55.095680 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-9s8kp" podUID="77584e4e-5147-421a-b7c0-24c7403c003a" Nov 26 22:41:58 crc kubenswrapper[5008]: I1126 22:41:58.911991 5008 patch_prober.go:28] interesting pod/downloads-7954f5f757-t29mr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Nov 26 22:41:58 crc kubenswrapper[5008]: I1126 22:41:58.912399 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t29mr" podUID="6f72cb1c-2994-4ed4-8333-b7498c1615bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Nov 26 22:41:59 crc kubenswrapper[5008]: I1126 22:41:59.281829 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 22:41:59 crc kubenswrapper[5008]: I1126 22:41:59.282014 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 22:41:59 crc kubenswrapper[5008]: E1126 22:41:59.619678 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-9s8kp" podUID="77584e4e-5147-421a-b7c0-24c7403c003a" Nov 26 22:41:59 crc kubenswrapper[5008]: E1126 22:41:59.620385 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-hqxrq" podUID="012ef2d4-9b7c-4838-9b17-345ee0057cd1" Nov 26 22:41:59 crc kubenswrapper[5008]: E1126 22:41:59.620430 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-pbtdl" podUID="3f7c489e-32bf-4761-a529-e8ca560145ad" Nov 26 22:42:05 crc kubenswrapper[5008]: E1126 22:42:05.409426 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 26 22:42:05 crc kubenswrapper[5008]: E1126 22:42:05.410023 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bd7r7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-5t8g6_openshift-marketplace(d66bc36d-82f1-4825-bb7d-a89eea9587e5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 26 22:42:05 crc kubenswrapper[5008]: E1126 22:42:05.411338 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-5t8g6" podUID="d66bc36d-82f1-4825-bb7d-a89eea9587e5" Nov 26 22:42:08 crc 
kubenswrapper[5008]: I1126 22:42:08.912231 5008 patch_prober.go:28] interesting pod/downloads-7954f5f757-t29mr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Nov 26 22:42:08 crc kubenswrapper[5008]: I1126 22:42:08.912568 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t29mr" podUID="6f72cb1c-2994-4ed4-8333-b7498c1615bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Nov 26 22:42:10 crc kubenswrapper[5008]: E1126 22:42:10.753318 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-5t8g6" podUID="d66bc36d-82f1-4825-bb7d-a89eea9587e5" Nov 26 22:42:11 crc kubenswrapper[5008]: I1126 22:42:11.207397 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xplkg"] Nov 26 22:42:11 crc kubenswrapper[5008]: I1126 22:42:11.513052 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xplkg" event={"ID":"7adf9a69-5de6-4710-b394-968387df9ae6","Type":"ContainerStarted","Data":"b80eacb5fa49d62284b3ff1fda2288320617fbee08d1da78a6ab5e0a47bbd0c5"} Nov 26 22:42:13 crc kubenswrapper[5008]: I1126 22:42:13.531803 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-p5xvf_743cf3d7-7296-4b87-94e3-67fc66dacca7/cluster-samples-operator/0.log" Nov 26 22:42:13 crc kubenswrapper[5008]: I1126 22:42:13.532183 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p5xvf" event={"ID":"743cf3d7-7296-4b87-94e3-67fc66dacca7","Type":"ContainerStarted","Data":"d272a6424e3e234a8bf331f45da7fbf79eb93c50204af4290738d20bdbf26a25"} Nov 26 22:42:15 crc kubenswrapper[5008]: I1126 22:42:15.551767 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xplkg" event={"ID":"7adf9a69-5de6-4710-b394-968387df9ae6","Type":"ContainerStarted","Data":"1da0a70723a085f2e842f23bf79139f9f59f417bef7af63017723030811a7227"} Nov 26 22:42:15 crc kubenswrapper[5008]: I1126 22:42:15.554217 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-t29mr" event={"ID":"6f72cb1c-2994-4ed4-8333-b7498c1615bf","Type":"ContainerStarted","Data":"0305166d9b7c3a0bd0a07ce529d712b9dc8046c0e79bd4928e6626abca0ddd64"} Nov 26 22:42:15 crc kubenswrapper[5008]: I1126 22:42:15.554400 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-t29mr" Nov 26 22:42:15 crc kubenswrapper[5008]: I1126 22:42:15.554778 5008 patch_prober.go:28] interesting pod/downloads-7954f5f757-t29mr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Nov 26 22:42:15 crc kubenswrapper[5008]: I1126 22:42:15.554825 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t29mr" podUID="6f72cb1c-2994-4ed4-8333-b7498c1615bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Nov 26 22:42:16 crc kubenswrapper[5008]: I1126 22:42:16.561910 5008 patch_prober.go:28] interesting pod/downloads-7954f5f757-t29mr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Nov 26 22:42:16 crc kubenswrapper[5008]: I1126 22:42:16.562003 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t29mr" podUID="6f72cb1c-2994-4ed4-8333-b7498c1615bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Nov 26 22:42:18 crc kubenswrapper[5008]: E1126 22:42:18.059467 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 26 22:42:18 crc kubenswrapper[5008]: E1126 22:42:18.059755 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ngh8v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mkfvd_openshift-marketplace(2e4023c0-2da0-4ad1-8709-e855a934018c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 26 22:42:18 crc kubenswrapper[5008]: E1126 22:42:18.061057 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mkfvd" podUID="2e4023c0-2da0-4ad1-8709-e855a934018c" Nov 26 22:42:18 crc 
kubenswrapper[5008]: E1126 22:42:18.538590 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 26 22:42:18 crc kubenswrapper[5008]: E1126 22:42:18.538813 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9sxqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-t7zbl_openshift-marketplace(c63d5fd4-0d21-44b4-862a-293ceb4321c9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 26 22:42:18 crc kubenswrapper[5008]: E1126 22:42:18.540123 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-t7zbl" podUID="c63d5fd4-0d21-44b4-862a-293ceb4321c9" Nov 26 22:42:18 crc kubenswrapper[5008]: I1126 22:42:18.912497 5008 patch_prober.go:28] interesting pod/downloads-7954f5f757-t29mr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Nov 26 22:42:18 crc kubenswrapper[5008]: I1126 22:42:18.912824 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-t29mr" podUID="6f72cb1c-2994-4ed4-8333-b7498c1615bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Nov 26 22:42:18 crc kubenswrapper[5008]: I1126 22:42:18.912644 5008 patch_prober.go:28] interesting pod/downloads-7954f5f757-t29mr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Nov 26 22:42:18 crc kubenswrapper[5008]: I1126 22:42:18.913237 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t29mr" podUID="6f72cb1c-2994-4ed4-8333-b7498c1615bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 
10.217.0.13:8080: connect: connection refused" Nov 26 22:42:28 crc kubenswrapper[5008]: I1126 22:42:28.925062 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-t29mr" Nov 26 22:42:29 crc kubenswrapper[5008]: I1126 22:42:29.281203 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 22:42:29 crc kubenswrapper[5008]: I1126 22:42:29.281278 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 22:42:29 crc kubenswrapper[5008]: I1126 22:42:29.281357 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 26 22:42:29 crc kubenswrapper[5008]: I1126 22:42:29.282080 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574"} pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 22:42:29 crc kubenswrapper[5008]: I1126 22:42:29.282172 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" containerID="cri-o://c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574" 
gracePeriod=600 Nov 26 22:42:30 crc kubenswrapper[5008]: I1126 22:42:30.636700 5008 generic.go:334] "Generic (PLEG): container finished" podID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerID="c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574" exitCode=0 Nov 26 22:42:30 crc kubenswrapper[5008]: I1126 22:42:30.636746 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" event={"ID":"8e558d58-c5ad-41f5-930f-36ac26b1a1ea","Type":"ContainerDied","Data":"c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574"} Nov 26 22:42:34 crc kubenswrapper[5008]: I1126 22:42:34.667946 5008 generic.go:334] "Generic (PLEG): container finished" podID="012ef2d4-9b7c-4838-9b17-345ee0057cd1" containerID="1af164d4859973141733955b706469ed4e4d3eb01e6c74fe52cb4c4f86c11ff6" exitCode=0 Nov 26 22:42:34 crc kubenswrapper[5008]: I1126 22:42:34.668090 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqxrq" event={"ID":"012ef2d4-9b7c-4838-9b17-345ee0057cd1","Type":"ContainerDied","Data":"1af164d4859973141733955b706469ed4e4d3eb01e6c74fe52cb4c4f86c11ff6"} Nov 26 22:42:34 crc kubenswrapper[5008]: I1126 22:42:34.678527 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5t8g6" event={"ID":"d66bc36d-82f1-4825-bb7d-a89eea9587e5","Type":"ContainerStarted","Data":"d3a166d6e8a93bc3a2b49f47f4e490515beed559ebf35bce98d60b484aae60d5"} Nov 26 22:42:34 crc kubenswrapper[5008]: I1126 22:42:34.705592 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" event={"ID":"8e558d58-c5ad-41f5-930f-36ac26b1a1ea","Type":"ContainerStarted","Data":"2f4b44c1d16055de7e482db16e8660c7a628418920e436930bbd5729f4dfeb2b"} Nov 26 22:42:34 crc kubenswrapper[5008]: I1126 22:42:34.727052 5008 generic.go:334] "Generic (PLEG): container finished" 
podID="77584e4e-5147-421a-b7c0-24c7403c003a" containerID="c0fc1dc25993d14ed01527fa4870ce6bded87f518f299fdf3876238f5306bdc5" exitCode=0 Nov 26 22:42:34 crc kubenswrapper[5008]: I1126 22:42:34.727149 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9s8kp" event={"ID":"77584e4e-5147-421a-b7c0-24c7403c003a","Type":"ContainerDied","Data":"c0fc1dc25993d14ed01527fa4870ce6bded87f518f299fdf3876238f5306bdc5"} Nov 26 22:42:34 crc kubenswrapper[5008]: I1126 22:42:34.730681 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7zbl" event={"ID":"c63d5fd4-0d21-44b4-862a-293ceb4321c9","Type":"ContainerStarted","Data":"b12374100ee123e18cd61cdeb710b12d4c0d8d31acd69abd86e6dbb1c483fa6c"} Nov 26 22:42:34 crc kubenswrapper[5008]: I1126 22:42:34.740933 5008 generic.go:334] "Generic (PLEG): container finished" podID="278822ff-d2bf-46ef-af05-b6e32a132844" containerID="d53930ba9135a214717ba08646a2149522ddf6c278d319ce6d863e83f3699aa2" exitCode=0 Nov 26 22:42:34 crc kubenswrapper[5008]: I1126 22:42:34.741038 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xqtk6" event={"ID":"278822ff-d2bf-46ef-af05-b6e32a132844","Type":"ContainerDied","Data":"d53930ba9135a214717ba08646a2149522ddf6c278d319ce6d863e83f3699aa2"} Nov 26 22:42:34 crc kubenswrapper[5008]: I1126 22:42:34.746936 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xplkg" event={"ID":"7adf9a69-5de6-4710-b394-968387df9ae6","Type":"ContainerStarted","Data":"37024943c8aadc513fb5356c378a57e1ced18c23d81604bc68958c86b11bd449"} Nov 26 22:42:34 crc kubenswrapper[5008]: I1126 22:42:34.750502 5008 generic.go:334] "Generic (PLEG): container finished" podID="3f7c489e-32bf-4761-a529-e8ca560145ad" containerID="0d7593ba68a326818344c18f053c68dc33e7cf72fc202f0355715f30dd27a50b" exitCode=0 Nov 26 22:42:34 crc kubenswrapper[5008]: I1126 22:42:34.750556 
5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbtdl" event={"ID":"3f7c489e-32bf-4761-a529-e8ca560145ad","Type":"ContainerDied","Data":"0d7593ba68a326818344c18f053c68dc33e7cf72fc202f0355715f30dd27a50b"} Nov 26 22:42:34 crc kubenswrapper[5008]: I1126 22:42:34.766925 5008 generic.go:334] "Generic (PLEG): container finished" podID="94caca18-eed8-46f1-818d-51433f83ae9c" containerID="dbd94e7872a8c5598f78811c09013dd705663b252142e44e158e4c2acd3dc365" exitCode=0 Nov 26 22:42:34 crc kubenswrapper[5008]: I1126 22:42:34.766993 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccg9j" event={"ID":"94caca18-eed8-46f1-818d-51433f83ae9c","Type":"ContainerDied","Data":"dbd94e7872a8c5598f78811c09013dd705663b252142e44e158e4c2acd3dc365"} Nov 26 22:42:34 crc kubenswrapper[5008]: I1126 22:42:34.845177 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xplkg" podStartSLOduration=207.84515041 podStartE2EDuration="3m27.84515041s" podCreationTimestamp="2025-11-26 22:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:42:34.840038826 +0000 UTC m=+230.252732828" watchObservedRunningTime="2025-11-26 22:42:34.84515041 +0000 UTC m=+230.257844412" Nov 26 22:42:35 crc kubenswrapper[5008]: I1126 22:42:35.775935 5008 generic.go:334] "Generic (PLEG): container finished" podID="d66bc36d-82f1-4825-bb7d-a89eea9587e5" containerID="d3a166d6e8a93bc3a2b49f47f4e490515beed559ebf35bce98d60b484aae60d5" exitCode=0 Nov 26 22:42:35 crc kubenswrapper[5008]: I1126 22:42:35.776021 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5t8g6" event={"ID":"d66bc36d-82f1-4825-bb7d-a89eea9587e5","Type":"ContainerDied","Data":"d3a166d6e8a93bc3a2b49f47f4e490515beed559ebf35bce98d60b484aae60d5"} Nov 26 22:42:35 
crc kubenswrapper[5008]: I1126 22:42:35.779942 5008 generic.go:334] "Generic (PLEG): container finished" podID="c63d5fd4-0d21-44b4-862a-293ceb4321c9" containerID="b12374100ee123e18cd61cdeb710b12d4c0d8d31acd69abd86e6dbb1c483fa6c" exitCode=0 Nov 26 22:42:35 crc kubenswrapper[5008]: I1126 22:42:35.780050 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7zbl" event={"ID":"c63d5fd4-0d21-44b4-862a-293ceb4321c9","Type":"ContainerDied","Data":"b12374100ee123e18cd61cdeb710b12d4c0d8d31acd69abd86e6dbb1c483fa6c"} Nov 26 22:42:35 crc kubenswrapper[5008]: I1126 22:42:35.782096 5008 generic.go:334] "Generic (PLEG): container finished" podID="2e4023c0-2da0-4ad1-8709-e855a934018c" containerID="8a628c2d27faf491e212c378f0f830a11d0ed84ef3560c3ecf48cbe076943b74" exitCode=0 Nov 26 22:42:35 crc kubenswrapper[5008]: I1126 22:42:35.783041 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkfvd" event={"ID":"2e4023c0-2da0-4ad1-8709-e855a934018c","Type":"ContainerDied","Data":"8a628c2d27faf491e212c378f0f830a11d0ed84ef3560c3ecf48cbe076943b74"} Nov 26 22:42:36 crc kubenswrapper[5008]: I1126 22:42:36.799832 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqxrq" event={"ID":"012ef2d4-9b7c-4838-9b17-345ee0057cd1","Type":"ContainerStarted","Data":"c3e24ae4a4aa24235118ba458112346b3c000ae395ab5e723c69d1aa14f818f6"} Nov 26 22:42:36 crc kubenswrapper[5008]: I1126 22:42:36.804198 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xqtk6" event={"ID":"278822ff-d2bf-46ef-af05-b6e32a132844","Type":"ContainerStarted","Data":"4bbfce20f22776a4336e0044b343b4879c838bf84613c02e4dd01452046b207a"} Nov 26 22:42:36 crc kubenswrapper[5008]: I1126 22:42:36.808527 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbtdl" 
event={"ID":"3f7c489e-32bf-4761-a529-e8ca560145ad","Type":"ContainerStarted","Data":"746688ce78192645402d2b74b01143ca831968ad708da95fcba266b0a9d059b4"} Nov 26 22:42:36 crc kubenswrapper[5008]: I1126 22:42:36.812050 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9s8kp" event={"ID":"77584e4e-5147-421a-b7c0-24c7403c003a","Type":"ContainerStarted","Data":"a273b99a8d55138fcf9edb805849f2a1a8845f0e32fe1cfa4d16c25b63c2bf7c"} Nov 26 22:42:36 crc kubenswrapper[5008]: I1126 22:42:36.847332 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hqxrq" podStartSLOduration=3.382385096 podStartE2EDuration="1m20.847312845s" podCreationTimestamp="2025-11-26 22:41:16 +0000 UTC" firstStartedPulling="2025-11-26 22:41:18.059887189 +0000 UTC m=+153.472581191" lastFinishedPulling="2025-11-26 22:42:35.524814938 +0000 UTC m=+230.937508940" observedRunningTime="2025-11-26 22:42:36.823066453 +0000 UTC m=+232.235760485" watchObservedRunningTime="2025-11-26 22:42:36.847312845 +0000 UTC m=+232.260006857" Nov 26 22:42:36 crc kubenswrapper[5008]: I1126 22:42:36.848278 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xqtk6" podStartSLOduration=2.725054941 podStartE2EDuration="1m20.848271826s" podCreationTimestamp="2025-11-26 22:41:16 +0000 UTC" firstStartedPulling="2025-11-26 22:41:18.050199779 +0000 UTC m=+153.462893781" lastFinishedPulling="2025-11-26 22:42:36.173416664 +0000 UTC m=+231.586110666" observedRunningTime="2025-11-26 22:42:36.846612883 +0000 UTC m=+232.259306905" watchObservedRunningTime="2025-11-26 22:42:36.848271826 +0000 UTC m=+232.260965848" Nov 26 22:42:36 crc kubenswrapper[5008]: I1126 22:42:36.869150 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9s8kp" podStartSLOduration=2.479579501 
podStartE2EDuration="1m20.869106848s" podCreationTimestamp="2025-11-26 22:41:16 +0000 UTC" firstStartedPulling="2025-11-26 22:41:18.053699427 +0000 UTC m=+153.466393429" lastFinishedPulling="2025-11-26 22:42:36.443226774 +0000 UTC m=+231.855920776" observedRunningTime="2025-11-26 22:42:36.864989515 +0000 UTC m=+232.277683527" watchObservedRunningTime="2025-11-26 22:42:36.869106848 +0000 UTC m=+232.281800860" Nov 26 22:42:36 crc kubenswrapper[5008]: I1126 22:42:36.887160 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pbtdl" podStartSLOduration=4.008998348 podStartE2EDuration="1m21.88713628s" podCreationTimestamp="2025-11-26 22:41:15 +0000 UTC" firstStartedPulling="2025-11-26 22:41:18.052455781 +0000 UTC m=+153.465149783" lastFinishedPulling="2025-11-26 22:42:35.930593713 +0000 UTC m=+231.343287715" observedRunningTime="2025-11-26 22:42:36.885490696 +0000 UTC m=+232.298184718" watchObservedRunningTime="2025-11-26 22:42:36.88713628 +0000 UTC m=+232.299830292" Nov 26 22:42:36 crc kubenswrapper[5008]: I1126 22:42:36.941746 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hqxrq" Nov 26 22:42:36 crc kubenswrapper[5008]: I1126 22:42:36.941930 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hqxrq" Nov 26 22:42:37 crc kubenswrapper[5008]: I1126 22:42:37.818978 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkfvd" event={"ID":"2e4023c0-2da0-4ad1-8709-e855a934018c","Type":"ContainerStarted","Data":"177fff7ea6a08dad500398798ad06c26c87167954272ecddf0e77aa1819c75f3"} Nov 26 22:42:37 crc kubenswrapper[5008]: I1126 22:42:37.822198 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccg9j" 
event={"ID":"94caca18-eed8-46f1-818d-51433f83ae9c","Type":"ContainerStarted","Data":"57eca25b2796637689c808b56ae62b7a5ac9ebcfa4625d2fb06ebaceed9f019e"} Nov 26 22:42:37 crc kubenswrapper[5008]: I1126 22:42:37.824768 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5t8g6" event={"ID":"d66bc36d-82f1-4825-bb7d-a89eea9587e5","Type":"ContainerStarted","Data":"b404bc6877a34491a742a8bc95bfa59028e1dd3706cb64c13a03e3dd0c441f6a"} Nov 26 22:42:37 crc kubenswrapper[5008]: I1126 22:42:37.827214 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7zbl" event={"ID":"c63d5fd4-0d21-44b4-862a-293ceb4321c9","Type":"ContainerStarted","Data":"3531b26b297ab8236e48c7384068bd1f65b75cdfb83f1175e32f8330c1988835"} Nov 26 22:42:37 crc kubenswrapper[5008]: I1126 22:42:37.842165 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mkfvd" podStartSLOduration=6.295862656 podStartE2EDuration="1m19.842147386s" podCreationTimestamp="2025-11-26 22:41:18 +0000 UTC" firstStartedPulling="2025-11-26 22:41:23.20876964 +0000 UTC m=+158.621463642" lastFinishedPulling="2025-11-26 22:42:36.75505433 +0000 UTC m=+232.167748372" observedRunningTime="2025-11-26 22:42:37.840189963 +0000 UTC m=+233.252883965" watchObservedRunningTime="2025-11-26 22:42:37.842147386 +0000 UTC m=+233.254841388" Nov 26 22:42:37 crc kubenswrapper[5008]: I1126 22:42:37.882713 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5t8g6" podStartSLOduration=4.680499429 podStartE2EDuration="1m18.882694664s" podCreationTimestamp="2025-11-26 22:41:19 +0000 UTC" firstStartedPulling="2025-11-26 22:41:23.204993595 +0000 UTC m=+158.617687587" lastFinishedPulling="2025-11-26 22:42:37.40718882 +0000 UTC m=+232.819882822" observedRunningTime="2025-11-26 22:42:37.881042661 +0000 UTC m=+233.293736663" 
watchObservedRunningTime="2025-11-26 22:42:37.882694664 +0000 UTC m=+233.295388666" Nov 26 22:42:37 crc kubenswrapper[5008]: I1126 22:42:37.883562 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t7zbl" podStartSLOduration=2.220349606 podStartE2EDuration="1m19.883556352s" podCreationTimestamp="2025-11-26 22:41:18 +0000 UTC" firstStartedPulling="2025-11-26 22:41:19.163900198 +0000 UTC m=+154.576594200" lastFinishedPulling="2025-11-26 22:42:36.827106914 +0000 UTC m=+232.239800946" observedRunningTime="2025-11-26 22:42:37.862601856 +0000 UTC m=+233.275295848" watchObservedRunningTime="2025-11-26 22:42:37.883556352 +0000 UTC m=+233.296250354" Nov 26 22:42:38 crc kubenswrapper[5008]: I1126 22:42:38.337680 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-hqxrq" podUID="012ef2d4-9b7c-4838-9b17-345ee0057cd1" containerName="registry-server" probeResult="failure" output=< Nov 26 22:42:38 crc kubenswrapper[5008]: timeout: failed to connect service ":50051" within 1s Nov 26 22:42:38 crc kubenswrapper[5008]: > Nov 26 22:42:38 crc kubenswrapper[5008]: I1126 22:42:38.535579 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t7zbl" Nov 26 22:42:38 crc kubenswrapper[5008]: I1126 22:42:38.535674 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t7zbl" Nov 26 22:42:38 crc kubenswrapper[5008]: I1126 22:42:38.920544 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mkfvd" Nov 26 22:42:38 crc kubenswrapper[5008]: I1126 22:42:38.920604 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mkfvd" Nov 26 22:42:38 crc kubenswrapper[5008]: I1126 22:42:38.962439 5008 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mkfvd" Nov 26 22:42:38 crc kubenswrapper[5008]: I1126 22:42:38.983564 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ccg9j" podStartSLOduration=6.134144617 podStartE2EDuration="1m19.983544814s" podCreationTimestamp="2025-11-26 22:41:19 +0000 UTC" firstStartedPulling="2025-11-26 22:41:23.206996151 +0000 UTC m=+158.619690173" lastFinishedPulling="2025-11-26 22:42:37.056396368 +0000 UTC m=+232.469090370" observedRunningTime="2025-11-26 22:42:37.90054489 +0000 UTC m=+233.313238902" watchObservedRunningTime="2025-11-26 22:42:38.983544814 +0000 UTC m=+234.396238816" Nov 26 22:42:39 crc kubenswrapper[5008]: I1126 22:42:39.565883 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5t8g6" Nov 26 22:42:39 crc kubenswrapper[5008]: I1126 22:42:39.565940 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5t8g6" Nov 26 22:42:39 crc kubenswrapper[5008]: I1126 22:42:39.590723 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-t7zbl" podUID="c63d5fd4-0d21-44b4-862a-293ceb4321c9" containerName="registry-server" probeResult="failure" output=< Nov 26 22:42:39 crc kubenswrapper[5008]: timeout: failed to connect service ":50051" within 1s Nov 26 22:42:39 crc kubenswrapper[5008]: > Nov 26 22:42:39 crc kubenswrapper[5008]: I1126 22:42:39.923856 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ccg9j" Nov 26 22:42:39 crc kubenswrapper[5008]: I1126 22:42:39.924004 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ccg9j" Nov 26 22:42:40 crc kubenswrapper[5008]: I1126 22:42:40.617368 5008 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-5t8g6" podUID="d66bc36d-82f1-4825-bb7d-a89eea9587e5" containerName="registry-server" probeResult="failure" output=< Nov 26 22:42:40 crc kubenswrapper[5008]: timeout: failed to connect service ":50051" within 1s Nov 26 22:42:40 crc kubenswrapper[5008]: > Nov 26 22:42:40 crc kubenswrapper[5008]: I1126 22:42:40.971844 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ccg9j" podUID="94caca18-eed8-46f1-818d-51433f83ae9c" containerName="registry-server" probeResult="failure" output=< Nov 26 22:42:40 crc kubenswrapper[5008]: timeout: failed to connect service ":50051" within 1s Nov 26 22:42:40 crc kubenswrapper[5008]: > Nov 26 22:42:46 crc kubenswrapper[5008]: I1126 22:42:46.324450 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pbtdl" Nov 26 22:42:46 crc kubenswrapper[5008]: I1126 22:42:46.324762 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pbtdl" Nov 26 22:42:46 crc kubenswrapper[5008]: I1126 22:42:46.379846 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pbtdl" Nov 26 22:42:46 crc kubenswrapper[5008]: I1126 22:42:46.529313 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xqtk6" Nov 26 22:42:46 crc kubenswrapper[5008]: I1126 22:42:46.529681 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xqtk6" Nov 26 22:42:46 crc kubenswrapper[5008]: I1126 22:42:46.597338 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xqtk6" Nov 26 22:42:46 crc kubenswrapper[5008]: I1126 22:42:46.734760 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-9s8kp" Nov 26 22:42:46 crc kubenswrapper[5008]: I1126 22:42:46.734796 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9s8kp" Nov 26 22:42:46 crc kubenswrapper[5008]: I1126 22:42:46.803778 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9s8kp" Nov 26 22:42:46 crc kubenswrapper[5008]: I1126 22:42:46.915697 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pbtdl" Nov 26 22:42:46 crc kubenswrapper[5008]: I1126 22:42:46.916830 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xqtk6" Nov 26 22:42:46 crc kubenswrapper[5008]: I1126 22:42:46.922326 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9s8kp" Nov 26 22:42:46 crc kubenswrapper[5008]: I1126 22:42:46.985187 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hqxrq" Nov 26 22:42:47 crc kubenswrapper[5008]: I1126 22:42:47.020439 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hqxrq" Nov 26 22:42:48 crc kubenswrapper[5008]: I1126 22:42:48.209874 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hqxrq"] Nov 26 22:42:48 crc kubenswrapper[5008]: I1126 22:42:48.569453 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t7zbl" Nov 26 22:42:48 crc kubenswrapper[5008]: I1126 22:42:48.604832 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t7zbl" Nov 26 22:42:48 crc kubenswrapper[5008]: I1126 
22:42:48.883644 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hqxrq" podUID="012ef2d4-9b7c-4838-9b17-345ee0057cd1" containerName="registry-server" containerID="cri-o://c3e24ae4a4aa24235118ba458112346b3c000ae395ab5e723c69d1aa14f818f6" gracePeriod=2 Nov 26 22:42:48 crc kubenswrapper[5008]: I1126 22:42:48.985644 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mkfvd" Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.214950 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9s8kp"] Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.215372 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9s8kp" podUID="77584e4e-5147-421a-b7c0-24c7403c003a" containerName="registry-server" containerID="cri-o://a273b99a8d55138fcf9edb805849f2a1a8845f0e32fe1cfa4d16c25b63c2bf7c" gracePeriod=2 Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.473227 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hqxrq" Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.577883 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5gnd\" (UniqueName: \"kubernetes.io/projected/012ef2d4-9b7c-4838-9b17-345ee0057cd1-kube-api-access-s5gnd\") pod \"012ef2d4-9b7c-4838-9b17-345ee0057cd1\" (UID: \"012ef2d4-9b7c-4838-9b17-345ee0057cd1\") " Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.577954 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/012ef2d4-9b7c-4838-9b17-345ee0057cd1-catalog-content\") pod \"012ef2d4-9b7c-4838-9b17-345ee0057cd1\" (UID: \"012ef2d4-9b7c-4838-9b17-345ee0057cd1\") " Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.578010 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/012ef2d4-9b7c-4838-9b17-345ee0057cd1-utilities\") pod \"012ef2d4-9b7c-4838-9b17-345ee0057cd1\" (UID: \"012ef2d4-9b7c-4838-9b17-345ee0057cd1\") " Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.578884 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/012ef2d4-9b7c-4838-9b17-345ee0057cd1-utilities" (OuterVolumeSpecName: "utilities") pod "012ef2d4-9b7c-4838-9b17-345ee0057cd1" (UID: "012ef2d4-9b7c-4838-9b17-345ee0057cd1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.583877 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/012ef2d4-9b7c-4838-9b17-345ee0057cd1-kube-api-access-s5gnd" (OuterVolumeSpecName: "kube-api-access-s5gnd") pod "012ef2d4-9b7c-4838-9b17-345ee0057cd1" (UID: "012ef2d4-9b7c-4838-9b17-345ee0057cd1"). InnerVolumeSpecName "kube-api-access-s5gnd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.613690 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5t8g6" Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.639655 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/012ef2d4-9b7c-4838-9b17-345ee0057cd1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "012ef2d4-9b7c-4838-9b17-345ee0057cd1" (UID: "012ef2d4-9b7c-4838-9b17-345ee0057cd1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.675203 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5t8g6" Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.679324 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5gnd\" (UniqueName: \"kubernetes.io/projected/012ef2d4-9b7c-4838-9b17-345ee0057cd1-kube-api-access-s5gnd\") on node \"crc\" DevicePath \"\"" Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.679353 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/012ef2d4-9b7c-4838-9b17-345ee0057cd1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.679390 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/012ef2d4-9b7c-4838-9b17-345ee0057cd1-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.734652 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9s8kp" Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.779825 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77584e4e-5147-421a-b7c0-24c7403c003a-utilities\") pod \"77584e4e-5147-421a-b7c0-24c7403c003a\" (UID: \"77584e4e-5147-421a-b7c0-24c7403c003a\") " Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.779918 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm4w6\" (UniqueName: \"kubernetes.io/projected/77584e4e-5147-421a-b7c0-24c7403c003a-kube-api-access-mm4w6\") pod \"77584e4e-5147-421a-b7c0-24c7403c003a\" (UID: \"77584e4e-5147-421a-b7c0-24c7403c003a\") " Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.779944 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77584e4e-5147-421a-b7c0-24c7403c003a-catalog-content\") pod \"77584e4e-5147-421a-b7c0-24c7403c003a\" (UID: \"77584e4e-5147-421a-b7c0-24c7403c003a\") " Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.781102 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77584e4e-5147-421a-b7c0-24c7403c003a-utilities" (OuterVolumeSpecName: "utilities") pod "77584e4e-5147-421a-b7c0-24c7403c003a" (UID: "77584e4e-5147-421a-b7c0-24c7403c003a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.784497 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77584e4e-5147-421a-b7c0-24c7403c003a-kube-api-access-mm4w6" (OuterVolumeSpecName: "kube-api-access-mm4w6") pod "77584e4e-5147-421a-b7c0-24c7403c003a" (UID: "77584e4e-5147-421a-b7c0-24c7403c003a"). InnerVolumeSpecName "kube-api-access-mm4w6". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.828337 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77584e4e-5147-421a-b7c0-24c7403c003a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77584e4e-5147-421a-b7c0-24c7403c003a" (UID: "77584e4e-5147-421a-b7c0-24c7403c003a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.880758 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm4w6\" (UniqueName: \"kubernetes.io/projected/77584e4e-5147-421a-b7c0-24c7403c003a-kube-api-access-mm4w6\") on node \"crc\" DevicePath \"\""
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.880799 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77584e4e-5147-421a-b7c0-24c7403c003a-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.880808 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77584e4e-5147-421a-b7c0-24c7403c003a-utilities\") on node \"crc\" DevicePath \"\""
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.890322 5008 generic.go:334] "Generic (PLEG): container finished" podID="77584e4e-5147-421a-b7c0-24c7403c003a" containerID="a273b99a8d55138fcf9edb805849f2a1a8845f0e32fe1cfa4d16c25b63c2bf7c" exitCode=0
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.890396 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9s8kp" event={"ID":"77584e4e-5147-421a-b7c0-24c7403c003a","Type":"ContainerDied","Data":"a273b99a8d55138fcf9edb805849f2a1a8845f0e32fe1cfa4d16c25b63c2bf7c"}
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.890444 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9s8kp" event={"ID":"77584e4e-5147-421a-b7c0-24c7403c003a","Type":"ContainerDied","Data":"b7904422f275eaa2ccd91db1f37effb2dcf85daaee69df157daca731bb2f3a59"}
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.890461 5008 scope.go:117] "RemoveContainer" containerID="a273b99a8d55138fcf9edb805849f2a1a8845f0e32fe1cfa4d16c25b63c2bf7c"
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.890579 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9s8kp"
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.898414 5008 generic.go:334] "Generic (PLEG): container finished" podID="012ef2d4-9b7c-4838-9b17-345ee0057cd1" containerID="c3e24ae4a4aa24235118ba458112346b3c000ae395ab5e723c69d1aa14f818f6" exitCode=0
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.899158 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqxrq"
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.899416 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqxrq" event={"ID":"012ef2d4-9b7c-4838-9b17-345ee0057cd1","Type":"ContainerDied","Data":"c3e24ae4a4aa24235118ba458112346b3c000ae395ab5e723c69d1aa14f818f6"}
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.899468 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqxrq" event={"ID":"012ef2d4-9b7c-4838-9b17-345ee0057cd1","Type":"ContainerDied","Data":"73ada0635782cac75900fda5c6af80fa481ec8217905aae0d48ebe0184fa3896"}
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.912878 5008 scope.go:117] "RemoveContainer" containerID="c0fc1dc25993d14ed01527fa4870ce6bded87f518f299fdf3876238f5306bdc5"
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.931253 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9s8kp"]
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.939407 5008 scope.go:117] "RemoveContainer" containerID="6fe66479a1ef2566f2b33d069323211455bd174791bc9062e11262873f541897"
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.940514 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9s8kp"]
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.943752 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hqxrq"]
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.946253 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hqxrq"]
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.963470 5008 scope.go:117] "RemoveContainer" containerID="a273b99a8d55138fcf9edb805849f2a1a8845f0e32fe1cfa4d16c25b63c2bf7c"
Nov 26 22:42:49 crc kubenswrapper[5008]: E1126 22:42:49.964107 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a273b99a8d55138fcf9edb805849f2a1a8845f0e32fe1cfa4d16c25b63c2bf7c\": container with ID starting with a273b99a8d55138fcf9edb805849f2a1a8845f0e32fe1cfa4d16c25b63c2bf7c not found: ID does not exist" containerID="a273b99a8d55138fcf9edb805849f2a1a8845f0e32fe1cfa4d16c25b63c2bf7c"
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.964144 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a273b99a8d55138fcf9edb805849f2a1a8845f0e32fe1cfa4d16c25b63c2bf7c"} err="failed to get container status \"a273b99a8d55138fcf9edb805849f2a1a8845f0e32fe1cfa4d16c25b63c2bf7c\": rpc error: code = NotFound desc = could not find container \"a273b99a8d55138fcf9edb805849f2a1a8845f0e32fe1cfa4d16c25b63c2bf7c\": container with ID starting with a273b99a8d55138fcf9edb805849f2a1a8845f0e32fe1cfa4d16c25b63c2bf7c not found: ID does not exist"
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.964171 5008 scope.go:117] "RemoveContainer" containerID="c0fc1dc25993d14ed01527fa4870ce6bded87f518f299fdf3876238f5306bdc5"
Nov 26 22:42:49 crc kubenswrapper[5008]: E1126 22:42:49.964864 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0fc1dc25993d14ed01527fa4870ce6bded87f518f299fdf3876238f5306bdc5\": container with ID starting with c0fc1dc25993d14ed01527fa4870ce6bded87f518f299fdf3876238f5306bdc5 not found: ID does not exist" containerID="c0fc1dc25993d14ed01527fa4870ce6bded87f518f299fdf3876238f5306bdc5"
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.964911 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0fc1dc25993d14ed01527fa4870ce6bded87f518f299fdf3876238f5306bdc5"} err="failed to get container status \"c0fc1dc25993d14ed01527fa4870ce6bded87f518f299fdf3876238f5306bdc5\": rpc error: code = NotFound desc = could not find container \"c0fc1dc25993d14ed01527fa4870ce6bded87f518f299fdf3876238f5306bdc5\": container with ID starting with c0fc1dc25993d14ed01527fa4870ce6bded87f518f299fdf3876238f5306bdc5 not found: ID does not exist"
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.964944 5008 scope.go:117] "RemoveContainer" containerID="6fe66479a1ef2566f2b33d069323211455bd174791bc9062e11262873f541897"
Nov 26 22:42:49 crc kubenswrapper[5008]: E1126 22:42:49.965408 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fe66479a1ef2566f2b33d069323211455bd174791bc9062e11262873f541897\": container with ID starting with 6fe66479a1ef2566f2b33d069323211455bd174791bc9062e11262873f541897 not found: ID does not exist" containerID="6fe66479a1ef2566f2b33d069323211455bd174791bc9062e11262873f541897"
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.965440 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fe66479a1ef2566f2b33d069323211455bd174791bc9062e11262873f541897"} err="failed to get container status \"6fe66479a1ef2566f2b33d069323211455bd174791bc9062e11262873f541897\": rpc error: code = NotFound desc = could not find container \"6fe66479a1ef2566f2b33d069323211455bd174791bc9062e11262873f541897\": container with ID starting with 6fe66479a1ef2566f2b33d069323211455bd174791bc9062e11262873f541897 not found: ID does not exist"
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.965462 5008 scope.go:117] "RemoveContainer" containerID="c3e24ae4a4aa24235118ba458112346b3c000ae395ab5e723c69d1aa14f818f6"
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.966136 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ccg9j"
Nov 26 22:42:49 crc kubenswrapper[5008]: I1126 22:42:49.989064 5008 scope.go:117] "RemoveContainer" containerID="1af164d4859973141733955b706469ed4e4d3eb01e6c74fe52cb4c4f86c11ff6"
Nov 26 22:42:50 crc kubenswrapper[5008]: I1126 22:42:50.017154 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ccg9j"
Nov 26 22:42:50 crc kubenswrapper[5008]: I1126 22:42:50.022569 5008 scope.go:117] "RemoveContainer" containerID="88989776b6848832c66d04d9f3d482fc34eadc49de88abece29f0a8c467edbd9"
Nov 26 22:42:50 crc kubenswrapper[5008]: I1126 22:42:50.045895 5008 scope.go:117] "RemoveContainer" containerID="c3e24ae4a4aa24235118ba458112346b3c000ae395ab5e723c69d1aa14f818f6"
Nov 26 22:42:50 crc kubenswrapper[5008]: E1126 22:42:50.046405 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3e24ae4a4aa24235118ba458112346b3c000ae395ab5e723c69d1aa14f818f6\": container with ID starting with c3e24ae4a4aa24235118ba458112346b3c000ae395ab5e723c69d1aa14f818f6 not found: ID does not exist" containerID="c3e24ae4a4aa24235118ba458112346b3c000ae395ab5e723c69d1aa14f818f6"
Nov 26 22:42:50 crc kubenswrapper[5008]: I1126 22:42:50.046447 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3e24ae4a4aa24235118ba458112346b3c000ae395ab5e723c69d1aa14f818f6"} err="failed to get container status \"c3e24ae4a4aa24235118ba458112346b3c000ae395ab5e723c69d1aa14f818f6\": rpc error: code = NotFound desc = could not find container \"c3e24ae4a4aa24235118ba458112346b3c000ae395ab5e723c69d1aa14f818f6\": container with ID starting with c3e24ae4a4aa24235118ba458112346b3c000ae395ab5e723c69d1aa14f818f6 not found: ID does not exist"
Nov 26 22:42:50 crc kubenswrapper[5008]: I1126 22:42:50.046473 5008 scope.go:117] "RemoveContainer" containerID="1af164d4859973141733955b706469ed4e4d3eb01e6c74fe52cb4c4f86c11ff6"
Nov 26 22:42:50 crc kubenswrapper[5008]: E1126 22:42:50.046812 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1af164d4859973141733955b706469ed4e4d3eb01e6c74fe52cb4c4f86c11ff6\": container with ID starting with 1af164d4859973141733955b706469ed4e4d3eb01e6c74fe52cb4c4f86c11ff6 not found: ID does not exist" containerID="1af164d4859973141733955b706469ed4e4d3eb01e6c74fe52cb4c4f86c11ff6"
Nov 26 22:42:50 crc kubenswrapper[5008]: I1126 22:42:50.046834 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1af164d4859973141733955b706469ed4e4d3eb01e6c74fe52cb4c4f86c11ff6"} err="failed to get container status \"1af164d4859973141733955b706469ed4e4d3eb01e6c74fe52cb4c4f86c11ff6\": rpc error: code = NotFound desc = could not find container \"1af164d4859973141733955b706469ed4e4d3eb01e6c74fe52cb4c4f86c11ff6\": container with ID starting with 1af164d4859973141733955b706469ed4e4d3eb01e6c74fe52cb4c4f86c11ff6 not found: ID does not exist"
Nov 26 22:42:50 crc kubenswrapper[5008]: I1126 22:42:50.046847 5008 scope.go:117] "RemoveContainer" containerID="88989776b6848832c66d04d9f3d482fc34eadc49de88abece29f0a8c467edbd9"
Nov 26 22:42:50 crc kubenswrapper[5008]: E1126 22:42:50.047109 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88989776b6848832c66d04d9f3d482fc34eadc49de88abece29f0a8c467edbd9\": container with ID starting with 88989776b6848832c66d04d9f3d482fc34eadc49de88abece29f0a8c467edbd9 not found: ID does not exist" containerID="88989776b6848832c66d04d9f3d482fc34eadc49de88abece29f0a8c467edbd9"
Nov 26 22:42:50 crc kubenswrapper[5008]: I1126 22:42:50.047149 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88989776b6848832c66d04d9f3d482fc34eadc49de88abece29f0a8c467edbd9"} err="failed to get container status \"88989776b6848832c66d04d9f3d482fc34eadc49de88abece29f0a8c467edbd9\": rpc error: code = NotFound desc = could not find container \"88989776b6848832c66d04d9f3d482fc34eadc49de88abece29f0a8c467edbd9\": container with ID starting with 88989776b6848832c66d04d9f3d482fc34eadc49de88abece29f0a8c467edbd9 not found: ID does not exist"
Nov 26 22:42:51 crc kubenswrapper[5008]: I1126 22:42:51.527051 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="012ef2d4-9b7c-4838-9b17-345ee0057cd1" path="/var/lib/kubelet/pods/012ef2d4-9b7c-4838-9b17-345ee0057cd1/volumes"
Nov 26 22:42:51 crc kubenswrapper[5008]: I1126 22:42:51.528448 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77584e4e-5147-421a-b7c0-24c7403c003a" path="/var/lib/kubelet/pods/77584e4e-5147-421a-b7c0-24c7403c003a/volumes"
Nov 26 22:42:51 crc kubenswrapper[5008]: I1126 22:42:51.610528 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkfvd"]
Nov 26 22:42:51 crc kubenswrapper[5008]: I1126 22:42:51.610838 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mkfvd" podUID="2e4023c0-2da0-4ad1-8709-e855a934018c" containerName="registry-server" containerID="cri-o://177fff7ea6a08dad500398798ad06c26c87167954272ecddf0e77aa1819c75f3" gracePeriod=2
Nov 26 22:42:52 crc kubenswrapper[5008]: I1126 22:42:52.924906 5008 generic.go:334] "Generic (PLEG): container finished" podID="2e4023c0-2da0-4ad1-8709-e855a934018c" containerID="177fff7ea6a08dad500398798ad06c26c87167954272ecddf0e77aa1819c75f3" exitCode=0
Nov 26 22:42:52 crc kubenswrapper[5008]: I1126 22:42:52.924987 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkfvd" event={"ID":"2e4023c0-2da0-4ad1-8709-e855a934018c","Type":"ContainerDied","Data":"177fff7ea6a08dad500398798ad06c26c87167954272ecddf0e77aa1819c75f3"}
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.008818 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ccg9j"]
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.009284 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ccg9j" podUID="94caca18-eed8-46f1-818d-51433f83ae9c" containerName="registry-server" containerID="cri-o://57eca25b2796637689c808b56ae62b7a5ac9ebcfa4625d2fb06ebaceed9f019e" gracePeriod=2
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.297001 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkfvd"
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.455634 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e4023c0-2da0-4ad1-8709-e855a934018c-catalog-content\") pod \"2e4023c0-2da0-4ad1-8709-e855a934018c\" (UID: \"2e4023c0-2da0-4ad1-8709-e855a934018c\") "
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.455760 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e4023c0-2da0-4ad1-8709-e855a934018c-utilities\") pod \"2e4023c0-2da0-4ad1-8709-e855a934018c\" (UID: \"2e4023c0-2da0-4ad1-8709-e855a934018c\") "
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.455954 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngh8v\" (UniqueName: \"kubernetes.io/projected/2e4023c0-2da0-4ad1-8709-e855a934018c-kube-api-access-ngh8v\") pod \"2e4023c0-2da0-4ad1-8709-e855a934018c\" (UID: \"2e4023c0-2da0-4ad1-8709-e855a934018c\") "
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.456626 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e4023c0-2da0-4ad1-8709-e855a934018c-utilities" (OuterVolumeSpecName: "utilities") pod "2e4023c0-2da0-4ad1-8709-e855a934018c" (UID: "2e4023c0-2da0-4ad1-8709-e855a934018c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.456820 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e4023c0-2da0-4ad1-8709-e855a934018c-utilities\") on node \"crc\" DevicePath \"\""
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.465107 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e4023c0-2da0-4ad1-8709-e855a934018c-kube-api-access-ngh8v" (OuterVolumeSpecName: "kube-api-access-ngh8v") pod "2e4023c0-2da0-4ad1-8709-e855a934018c" (UID: "2e4023c0-2da0-4ad1-8709-e855a934018c"). InnerVolumeSpecName "kube-api-access-ngh8v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.471411 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e4023c0-2da0-4ad1-8709-e855a934018c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e4023c0-2da0-4ad1-8709-e855a934018c" (UID: "2e4023c0-2da0-4ad1-8709-e855a934018c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.557980 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e4023c0-2da0-4ad1-8709-e855a934018c-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.558017 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngh8v\" (UniqueName: \"kubernetes.io/projected/2e4023c0-2da0-4ad1-8709-e855a934018c-kube-api-access-ngh8v\") on node \"crc\" DevicePath \"\""
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.939469 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkfvd" event={"ID":"2e4023c0-2da0-4ad1-8709-e855a934018c","Type":"ContainerDied","Data":"efb5e533363e50b21fc9c20a09644f5ce065f6d03bc2acb2356ac74f0c300f55"}
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.939850 5008 scope.go:117] "RemoveContainer" containerID="177fff7ea6a08dad500398798ad06c26c87167954272ecddf0e77aa1819c75f3"
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.939878 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkfvd"
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.942229 5008 generic.go:334] "Generic (PLEG): container finished" podID="94caca18-eed8-46f1-818d-51433f83ae9c" containerID="57eca25b2796637689c808b56ae62b7a5ac9ebcfa4625d2fb06ebaceed9f019e" exitCode=0
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.942293 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccg9j" event={"ID":"94caca18-eed8-46f1-818d-51433f83ae9c","Type":"ContainerDied","Data":"57eca25b2796637689c808b56ae62b7a5ac9ebcfa4625d2fb06ebaceed9f019e"}
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.974656 5008 scope.go:117] "RemoveContainer" containerID="8a628c2d27faf491e212c378f0f830a11d0ed84ef3560c3ecf48cbe076943b74"
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.985361 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkfvd"]
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.988017 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkfvd"]
Nov 26 22:42:54 crc kubenswrapper[5008]: I1126 22:42:54.990913 5008 scope.go:117] "RemoveContainer" containerID="5bfaf6ca0c6248e7aa8da4a05e0302b323fd206b26f8bb8279dbdef2f3ace0c0"
Nov 26 22:42:55 crc kubenswrapper[5008]: I1126 22:42:55.526550 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e4023c0-2da0-4ad1-8709-e855a934018c" path="/var/lib/kubelet/pods/2e4023c0-2da0-4ad1-8709-e855a934018c/volumes"
Nov 26 22:42:56 crc kubenswrapper[5008]: I1126 22:42:56.241180 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ccg9j"
Nov 26 22:42:56 crc kubenswrapper[5008]: I1126 22:42:56.282115 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94caca18-eed8-46f1-818d-51433f83ae9c-utilities\") pod \"94caca18-eed8-46f1-818d-51433f83ae9c\" (UID: \"94caca18-eed8-46f1-818d-51433f83ae9c\") "
Nov 26 22:42:56 crc kubenswrapper[5008]: I1126 22:42:56.282172 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94caca18-eed8-46f1-818d-51433f83ae9c-catalog-content\") pod \"94caca18-eed8-46f1-818d-51433f83ae9c\" (UID: \"94caca18-eed8-46f1-818d-51433f83ae9c\") "
Nov 26 22:42:56 crc kubenswrapper[5008]: I1126 22:42:56.282308 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r2jn\" (UniqueName: \"kubernetes.io/projected/94caca18-eed8-46f1-818d-51433f83ae9c-kube-api-access-2r2jn\") pod \"94caca18-eed8-46f1-818d-51433f83ae9c\" (UID: \"94caca18-eed8-46f1-818d-51433f83ae9c\") "
Nov 26 22:42:56 crc kubenswrapper[5008]: I1126 22:42:56.284266 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94caca18-eed8-46f1-818d-51433f83ae9c-utilities" (OuterVolumeSpecName: "utilities") pod "94caca18-eed8-46f1-818d-51433f83ae9c" (UID: "94caca18-eed8-46f1-818d-51433f83ae9c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 22:42:56 crc kubenswrapper[5008]: I1126 22:42:56.293479 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94caca18-eed8-46f1-818d-51433f83ae9c-kube-api-access-2r2jn" (OuterVolumeSpecName: "kube-api-access-2r2jn") pod "94caca18-eed8-46f1-818d-51433f83ae9c" (UID: "94caca18-eed8-46f1-818d-51433f83ae9c"). InnerVolumeSpecName "kube-api-access-2r2jn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:42:56 crc kubenswrapper[5008]: I1126 22:42:56.384093 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r2jn\" (UniqueName: \"kubernetes.io/projected/94caca18-eed8-46f1-818d-51433f83ae9c-kube-api-access-2r2jn\") on node \"crc\" DevicePath \"\""
Nov 26 22:42:56 crc kubenswrapper[5008]: I1126 22:42:56.384558 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94caca18-eed8-46f1-818d-51433f83ae9c-utilities\") on node \"crc\" DevicePath \"\""
Nov 26 22:42:56 crc kubenswrapper[5008]: I1126 22:42:56.388339 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94caca18-eed8-46f1-818d-51433f83ae9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94caca18-eed8-46f1-818d-51433f83ae9c" (UID: "94caca18-eed8-46f1-818d-51433f83ae9c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 22:42:56 crc kubenswrapper[5008]: I1126 22:42:56.486086 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94caca18-eed8-46f1-818d-51433f83ae9c-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 26 22:42:56 crc kubenswrapper[5008]: I1126 22:42:56.955630 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccg9j" event={"ID":"94caca18-eed8-46f1-818d-51433f83ae9c","Type":"ContainerDied","Data":"8e551cf9f526dc582248f43b650895463870e77d77d174e8417fe5d9dd69f8af"}
Nov 26 22:42:56 crc kubenswrapper[5008]: I1126 22:42:56.955678 5008 scope.go:117] "RemoveContainer" containerID="57eca25b2796637689c808b56ae62b7a5ac9ebcfa4625d2fb06ebaceed9f019e"
Nov 26 22:42:56 crc kubenswrapper[5008]: I1126 22:42:56.955792 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ccg9j"
Nov 26 22:42:56 crc kubenswrapper[5008]: I1126 22:42:56.988154 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ccg9j"]
Nov 26 22:42:56 crc kubenswrapper[5008]: I1126 22:42:56.995470 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ccg9j"]
Nov 26 22:42:56 crc kubenswrapper[5008]: I1126 22:42:56.996476 5008 scope.go:117] "RemoveContainer" containerID="dbd94e7872a8c5598f78811c09013dd705663b252142e44e158e4c2acd3dc365"
Nov 26 22:42:57 crc kubenswrapper[5008]: I1126 22:42:57.027306 5008 scope.go:117] "RemoveContainer" containerID="7f39b45e537c81455d246567bc429140a911db5e87e6db691b9b6e975590a514"
Nov 26 22:42:57 crc kubenswrapper[5008]: I1126 22:42:57.527534 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94caca18-eed8-46f1-818d-51433f83ae9c" path="/var/lib/kubelet/pods/94caca18-eed8-46f1-818d-51433f83ae9c/volumes"
Nov 26 22:43:29 crc kubenswrapper[5008]: I1126 22:43:29.224713 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k77kt"]
Nov 26 22:43:45 crc kubenswrapper[5008]: I1126 22:43:45.274260 5008 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.247623 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" podUID="1cf78e11-8113-4dd4-9b76-182452887bf3" containerName="oauth-openshift" containerID="cri-o://3cdd410a22c15727acd0d55677a49415abc4b83968d2b0c70973b21d2cd9c267" gracePeriod=15
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.729330 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-k77kt"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.770353 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5969b76fdc-pcwg6"]
Nov 26 22:43:54 crc kubenswrapper[5008]: E1126 22:43:54.770607 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77584e4e-5147-421a-b7c0-24c7403c003a" containerName="registry-server"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.770625 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="77584e4e-5147-421a-b7c0-24c7403c003a" containerName="registry-server"
Nov 26 22:43:54 crc kubenswrapper[5008]: E1126 22:43:54.770643 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77584e4e-5147-421a-b7c0-24c7403c003a" containerName="extract-utilities"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.770651 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="77584e4e-5147-421a-b7c0-24c7403c003a" containerName="extract-utilities"
Nov 26 22:43:54 crc kubenswrapper[5008]: E1126 22:43:54.770660 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e4023c0-2da0-4ad1-8709-e855a934018c" containerName="extract-utilities"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.770668 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4023c0-2da0-4ad1-8709-e855a934018c" containerName="extract-utilities"
Nov 26 22:43:54 crc kubenswrapper[5008]: E1126 22:43:54.770679 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94caca18-eed8-46f1-818d-51433f83ae9c" containerName="extract-utilities"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.770687 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="94caca18-eed8-46f1-818d-51433f83ae9c" containerName="extract-utilities"
Nov 26 22:43:54 crc kubenswrapper[5008]: E1126 22:43:54.770696 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="012ef2d4-9b7c-4838-9b17-345ee0057cd1" containerName="registry-server"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.770705 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="012ef2d4-9b7c-4838-9b17-345ee0057cd1" containerName="registry-server"
Nov 26 22:43:54 crc kubenswrapper[5008]: E1126 22:43:54.770719 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e4023c0-2da0-4ad1-8709-e855a934018c" containerName="registry-server"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.770727 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4023c0-2da0-4ad1-8709-e855a934018c" containerName="registry-server"
Nov 26 22:43:54 crc kubenswrapper[5008]: E1126 22:43:54.770740 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a237a3-af26-4376-a287-223050c2334a" containerName="pruner"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.770747 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a237a3-af26-4376-a287-223050c2334a" containerName="pruner"
Nov 26 22:43:54 crc kubenswrapper[5008]: E1126 22:43:54.770758 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94caca18-eed8-46f1-818d-51433f83ae9c" containerName="extract-content"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.770765 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="94caca18-eed8-46f1-818d-51433f83ae9c" containerName="extract-content"
Nov 26 22:43:54 crc kubenswrapper[5008]: E1126 22:43:54.770775 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77584e4e-5147-421a-b7c0-24c7403c003a" containerName="extract-content"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.770783 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="77584e4e-5147-421a-b7c0-24c7403c003a" containerName="extract-content"
Nov 26 22:43:54 crc kubenswrapper[5008]: E1126 22:43:54.770794 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cf78e11-8113-4dd4-9b76-182452887bf3" containerName="oauth-openshift"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.770801 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cf78e11-8113-4dd4-9b76-182452887bf3" containerName="oauth-openshift"
Nov 26 22:43:54 crc kubenswrapper[5008]: E1126 22:43:54.770812 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94caca18-eed8-46f1-818d-51433f83ae9c" containerName="registry-server"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.770821 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="94caca18-eed8-46f1-818d-51433f83ae9c" containerName="registry-server"
Nov 26 22:43:54 crc kubenswrapper[5008]: E1126 22:43:54.770830 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e4023c0-2da0-4ad1-8709-e855a934018c" containerName="extract-content"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.770838 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4023c0-2da0-4ad1-8709-e855a934018c" containerName="extract-content"
Nov 26 22:43:54 crc kubenswrapper[5008]: E1126 22:43:54.770850 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="012ef2d4-9b7c-4838-9b17-345ee0057cd1" containerName="extract-content"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.770857 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="012ef2d4-9b7c-4838-9b17-345ee0057cd1" containerName="extract-content"
Nov 26 22:43:54 crc kubenswrapper[5008]: E1126 22:43:54.770867 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="012ef2d4-9b7c-4838-9b17-345ee0057cd1" containerName="extract-utilities"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.770875 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="012ef2d4-9b7c-4838-9b17-345ee0057cd1" containerName="extract-utilities"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.771014 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e4023c0-2da0-4ad1-8709-e855a934018c" containerName="registry-server"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.771032 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="94caca18-eed8-46f1-818d-51433f83ae9c" containerName="registry-server"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.771042 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cf78e11-8113-4dd4-9b76-182452887bf3" containerName="oauth-openshift"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.771056 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="012ef2d4-9b7c-4838-9b17-345ee0057cd1" containerName="registry-server"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.771071 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="77584e4e-5147-421a-b7c0-24c7403c003a" containerName="registry-server"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.771080 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a237a3-af26-4376-a287-223050c2334a" containerName="pruner"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.771495 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.787142 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5969b76fdc-pcwg6"]
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.828879 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8z8d\" (UniqueName: \"kubernetes.io/projected/1cf78e11-8113-4dd4-9b76-182452887bf3-kube-api-access-c8z8d\") pod \"1cf78e11-8113-4dd4-9b76-182452887bf3\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") "
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.828991 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1cf78e11-8113-4dd4-9b76-182452887bf3-audit-dir\") pod \"1cf78e11-8113-4dd4-9b76-182452887bf3\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") "
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829055 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-ocp-branding-template\") pod \"1cf78e11-8113-4dd4-9b76-182452887bf3\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") "
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829062 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1cf78e11-8113-4dd4-9b76-182452887bf3-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "1cf78e11-8113-4dd4-9b76-182452887bf3" (UID: "1cf78e11-8113-4dd4-9b76-182452887bf3"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829092 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-idp-0-file-data\") pod \"1cf78e11-8113-4dd4-9b76-182452887bf3\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") "
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829129 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-serving-cert\") pod \"1cf78e11-8113-4dd4-9b76-182452887bf3\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") "
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829170 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-template-login\") pod \"1cf78e11-8113-4dd4-9b76-182452887bf3\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") "
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829206 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-session\") pod \"1cf78e11-8113-4dd4-9b76-182452887bf3\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") "
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829245 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-cliconfig\") pod \"1cf78e11-8113-4dd4-9b76-182452887bf3\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") "
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829279 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-template-provider-selection\") pod \"1cf78e11-8113-4dd4-9b76-182452887bf3\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") "
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829311 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-audit-policies\") pod \"1cf78e11-8113-4dd4-9b76-182452887bf3\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") "
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829345 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-service-ca\") pod \"1cf78e11-8113-4dd4-9b76-182452887bf3\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") "
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829379 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-trusted-ca-bundle\") pod \"1cf78e11-8113-4dd4-9b76-182452887bf3\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") "
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829414 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-router-certs\") pod \"1cf78e11-8113-4dd4-9b76-182452887bf3\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") "
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829451 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-template-error\") pod \"1cf78e11-8113-4dd4-9b76-182452887bf3\" (UID: \"1cf78e11-8113-4dd4-9b76-182452887bf3\") "
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829555 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829603 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-session\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829633 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6"
Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829666 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName:
\"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829696 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-router-certs\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829726 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829765 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-user-template-login\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829803 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-user-template-error\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: 
\"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829842 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-audit-policies\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.829872 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljz4h\" (UniqueName: \"kubernetes.io/projected/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-kube-api-access-ljz4h\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.830115 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-service-ca\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.830813 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-audit-dir\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.830532 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "1cf78e11-8113-4dd4-9b76-182452887bf3" (UID: "1cf78e11-8113-4dd4-9b76-182452887bf3"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.830630 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "1cf78e11-8113-4dd4-9b76-182452887bf3" (UID: "1cf78e11-8113-4dd4-9b76-182452887bf3"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.831141 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "1cf78e11-8113-4dd4-9b76-182452887bf3" (UID: "1cf78e11-8113-4dd4-9b76-182452887bf3"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.831217 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.831630 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.831943 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "1cf78e11-8113-4dd4-9b76-182452887bf3" (UID: "1cf78e11-8113-4dd4-9b76-182452887bf3"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.832161 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.832299 5008 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1cf78e11-8113-4dd4-9b76-182452887bf3-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.832398 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.832485 5008 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.832564 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.835742 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "1cf78e11-8113-4dd4-9b76-182452887bf3" (UID: "1cf78e11-8113-4dd4-9b76-182452887bf3"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.835955 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "1cf78e11-8113-4dd4-9b76-182452887bf3" (UID: "1cf78e11-8113-4dd4-9b76-182452887bf3"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.836530 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "1cf78e11-8113-4dd4-9b76-182452887bf3" (UID: "1cf78e11-8113-4dd4-9b76-182452887bf3"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.836555 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cf78e11-8113-4dd4-9b76-182452887bf3-kube-api-access-c8z8d" (OuterVolumeSpecName: "kube-api-access-c8z8d") pod "1cf78e11-8113-4dd4-9b76-182452887bf3" (UID: "1cf78e11-8113-4dd4-9b76-182452887bf3"). InnerVolumeSpecName "kube-api-access-c8z8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.836919 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "1cf78e11-8113-4dd4-9b76-182452887bf3" (UID: "1cf78e11-8113-4dd4-9b76-182452887bf3"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.841325 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "1cf78e11-8113-4dd4-9b76-182452887bf3" (UID: "1cf78e11-8113-4dd4-9b76-182452887bf3"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.841481 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "1cf78e11-8113-4dd4-9b76-182452887bf3" (UID: "1cf78e11-8113-4dd4-9b76-182452887bf3"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.841662 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "1cf78e11-8113-4dd4-9b76-182452887bf3" (UID: "1cf78e11-8113-4dd4-9b76-182452887bf3"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.842165 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "1cf78e11-8113-4dd4-9b76-182452887bf3" (UID: "1cf78e11-8113-4dd4-9b76-182452887bf3"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.933486 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-user-template-login\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.933782 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-user-template-error\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.933921 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-audit-policies\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.934051 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljz4h\" (UniqueName: \"kubernetes.io/projected/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-kube-api-access-ljz4h\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.934180 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-service-ca\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.934293 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-audit-dir\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.934412 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-audit-dir\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.934582 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.934749 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 
22:43:54.934915 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-service-ca\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.935185 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.935529 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-session\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.935779 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.935917 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.936046 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-router-certs\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.936181 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.936330 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.936420 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.936556 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.936681 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.936907 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.937117 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.937281 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.937420 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.935857 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-audit-policies\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.935304 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.937635 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8z8d\" (UniqueName: \"kubernetes.io/projected/1cf78e11-8113-4dd4-9b76-182452887bf3-kube-api-access-c8z8d\") on node \"crc\" DevicePath \"\"" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.937755 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1cf78e11-8113-4dd4-9b76-182452887bf3-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.938858 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-user-template-error\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.939420 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.939761 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.940130 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.940321 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-user-template-login\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.941617 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-session\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 
26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.942615 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-router-certs\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.943324 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:54 crc kubenswrapper[5008]: I1126 22:43:54.949453 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljz4h\" (UniqueName: \"kubernetes.io/projected/59a77a60-f253-40e0-9a84-4fff3a3ff9b5-kube-api-access-ljz4h\") pod \"oauth-openshift-5969b76fdc-pcwg6\" (UID: \"59a77a60-f253-40e0-9a84-4fff3a3ff9b5\") " pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:55 crc kubenswrapper[5008]: I1126 22:43:55.089833 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:55 crc kubenswrapper[5008]: I1126 22:43:55.322301 5008 generic.go:334] "Generic (PLEG): container finished" podID="1cf78e11-8113-4dd4-9b76-182452887bf3" containerID="3cdd410a22c15727acd0d55677a49415abc4b83968d2b0c70973b21d2cd9c267" exitCode=0 Nov 26 22:43:55 crc kubenswrapper[5008]: I1126 22:43:55.322387 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" Nov 26 22:43:55 crc kubenswrapper[5008]: I1126 22:43:55.322412 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" event={"ID":"1cf78e11-8113-4dd4-9b76-182452887bf3","Type":"ContainerDied","Data":"3cdd410a22c15727acd0d55677a49415abc4b83968d2b0c70973b21d2cd9c267"} Nov 26 22:43:55 crc kubenswrapper[5008]: I1126 22:43:55.323452 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-k77kt" event={"ID":"1cf78e11-8113-4dd4-9b76-182452887bf3","Type":"ContainerDied","Data":"e51ffbd72c01fdbdf6aa80194fbf271954ca3977b1042e1addd683fb79b360d7"} Nov 26 22:43:55 crc kubenswrapper[5008]: I1126 22:43:55.323513 5008 scope.go:117] "RemoveContainer" containerID="3cdd410a22c15727acd0d55677a49415abc4b83968d2b0c70973b21d2cd9c267" Nov 26 22:43:55 crc kubenswrapper[5008]: I1126 22:43:55.349145 5008 scope.go:117] "RemoveContainer" containerID="3cdd410a22c15727acd0d55677a49415abc4b83968d2b0c70973b21d2cd9c267" Nov 26 22:43:55 crc kubenswrapper[5008]: E1126 22:43:55.349686 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cdd410a22c15727acd0d55677a49415abc4b83968d2b0c70973b21d2cd9c267\": container with ID starting with 3cdd410a22c15727acd0d55677a49415abc4b83968d2b0c70973b21d2cd9c267 not found: ID does not exist" containerID="3cdd410a22c15727acd0d55677a49415abc4b83968d2b0c70973b21d2cd9c267" Nov 26 22:43:55 crc kubenswrapper[5008]: I1126 22:43:55.349730 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cdd410a22c15727acd0d55677a49415abc4b83968d2b0c70973b21d2cd9c267"} err="failed to get container status \"3cdd410a22c15727acd0d55677a49415abc4b83968d2b0c70973b21d2cd9c267\": rpc error: code = NotFound desc = could not find container 
\"3cdd410a22c15727acd0d55677a49415abc4b83968d2b0c70973b21d2cd9c267\": container with ID starting with 3cdd410a22c15727acd0d55677a49415abc4b83968d2b0c70973b21d2cd9c267 not found: ID does not exist" Nov 26 22:43:55 crc kubenswrapper[5008]: I1126 22:43:55.374535 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k77kt"] Nov 26 22:43:55 crc kubenswrapper[5008]: I1126 22:43:55.378437 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k77kt"] Nov 26 22:43:55 crc kubenswrapper[5008]: I1126 22:43:55.381826 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5969b76fdc-pcwg6"] Nov 26 22:43:55 crc kubenswrapper[5008]: W1126 22:43:55.385520 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59a77a60_f253_40e0_9a84_4fff3a3ff9b5.slice/crio-79c1cc7f43a8e0eb2737c4073ebabc355e3f9a176b4f1081b74aac4aa6aba44d WatchSource:0}: Error finding container 79c1cc7f43a8e0eb2737c4073ebabc355e3f9a176b4f1081b74aac4aa6aba44d: Status 404 returned error can't find the container with id 79c1cc7f43a8e0eb2737c4073ebabc355e3f9a176b4f1081b74aac4aa6aba44d Nov 26 22:43:55 crc kubenswrapper[5008]: I1126 22:43:55.528090 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cf78e11-8113-4dd4-9b76-182452887bf3" path="/var/lib/kubelet/pods/1cf78e11-8113-4dd4-9b76-182452887bf3/volumes" Nov 26 22:43:56 crc kubenswrapper[5008]: I1126 22:43:56.339454 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" event={"ID":"59a77a60-f253-40e0-9a84-4fff3a3ff9b5","Type":"ContainerStarted","Data":"6dd23be7cb3a9fd3f8a4778558dab790bdfdf62a72e4268a88f6235318614824"} Nov 26 22:43:56 crc kubenswrapper[5008]: I1126 22:43:56.339560 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" event={"ID":"59a77a60-f253-40e0-9a84-4fff3a3ff9b5","Type":"ContainerStarted","Data":"79c1cc7f43a8e0eb2737c4073ebabc355e3f9a176b4f1081b74aac4aa6aba44d"} Nov 26 22:43:56 crc kubenswrapper[5008]: I1126 22:43:56.339861 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:43:56 crc kubenswrapper[5008]: I1126 22:43:56.376357 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" podStartSLOduration=27.376336859 podStartE2EDuration="27.376336859s" podCreationTimestamp="2025-11-26 22:43:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:43:56.375137381 +0000 UTC m=+311.787831433" watchObservedRunningTime="2025-11-26 22:43:56.376336859 +0000 UTC m=+311.789030881" Nov 26 22:43:56 crc kubenswrapper[5008]: I1126 22:43:56.530578 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5969b76fdc-pcwg6" Nov 26 22:44:28 crc kubenswrapper[5008]: I1126 22:44:28.765184 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xqtk6"] Nov 26 22:44:28 crc kubenswrapper[5008]: I1126 22:44:28.766416 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xqtk6" podUID="278822ff-d2bf-46ef-af05-b6e32a132844" containerName="registry-server" containerID="cri-o://4bbfce20f22776a4336e0044b343b4879c838bf84613c02e4dd01452046b207a" gracePeriod=30 Nov 26 22:44:28 crc kubenswrapper[5008]: I1126 22:44:28.780662 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pbtdl"] Nov 26 22:44:28 crc kubenswrapper[5008]: I1126 22:44:28.781026 5008 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pbtdl" podUID="3f7c489e-32bf-4761-a529-e8ca560145ad" containerName="registry-server" containerID="cri-o://746688ce78192645402d2b74b01143ca831968ad708da95fcba266b0a9d059b4" gracePeriod=30 Nov 26 22:44:28 crc kubenswrapper[5008]: I1126 22:44:28.796084 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rwq8l"] Nov 26 22:44:28 crc kubenswrapper[5008]: I1126 22:44:28.796337 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" podUID="97073e75-5bda-4dc4-8d80-bf408068aaef" containerName="marketplace-operator" containerID="cri-o://c8df45a267dc1bcbf7bf592bb2f4358771c356d549006d2d30985fa9e3171238" gracePeriod=30 Nov 26 22:44:28 crc kubenswrapper[5008]: I1126 22:44:28.810362 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gd9vw"] Nov 26 22:44:28 crc kubenswrapper[5008]: I1126 22:44:28.811719 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gd9vw" Nov 26 22:44:28 crc kubenswrapper[5008]: I1126 22:44:28.823100 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t7zbl"] Nov 26 22:44:28 crc kubenswrapper[5008]: I1126 22:44:28.823727 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t7zbl" podUID="c63d5fd4-0d21-44b4-862a-293ceb4321c9" containerName="registry-server" containerID="cri-o://3531b26b297ab8236e48c7384068bd1f65b75cdfb83f1175e32f8330c1988835" gracePeriod=30 Nov 26 22:44:28 crc kubenswrapper[5008]: I1126 22:44:28.836163 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gd9vw"] Nov 26 22:44:28 crc kubenswrapper[5008]: I1126 22:44:28.842799 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5t8g6"] Nov 26 22:44:28 crc kubenswrapper[5008]: I1126 22:44:28.843081 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5t8g6" podUID="d66bc36d-82f1-4825-bb7d-a89eea9587e5" containerName="registry-server" containerID="cri-o://b404bc6877a34491a742a8bc95bfa59028e1dd3706cb64c13a03e3dd0c441f6a" gracePeriod=30 Nov 26 22:44:28 crc kubenswrapper[5008]: I1126 22:44:28.991519 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8b99\" (UniqueName: \"kubernetes.io/projected/4f4cad47-dfc0-4c9d-9255-58c022732fb7-kube-api-access-q8b99\") pod \"marketplace-operator-79b997595-gd9vw\" (UID: \"4f4cad47-dfc0-4c9d-9255-58c022732fb7\") " pod="openshift-marketplace/marketplace-operator-79b997595-gd9vw" Nov 26 22:44:28 crc kubenswrapper[5008]: I1126 22:44:28.991590 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4f4cad47-dfc0-4c9d-9255-58c022732fb7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gd9vw\" (UID: \"4f4cad47-dfc0-4c9d-9255-58c022732fb7\") " pod="openshift-marketplace/marketplace-operator-79b997595-gd9vw" Nov 26 22:44:28 crc kubenswrapper[5008]: I1126 22:44:28.991684 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f4cad47-dfc0-4c9d-9255-58c022732fb7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gd9vw\" (UID: \"4f4cad47-dfc0-4c9d-9255-58c022732fb7\") " pod="openshift-marketplace/marketplace-operator-79b997595-gd9vw" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.100816 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8b99\" (UniqueName: \"kubernetes.io/projected/4f4cad47-dfc0-4c9d-9255-58c022732fb7-kube-api-access-q8b99\") pod \"marketplace-operator-79b997595-gd9vw\" (UID: \"4f4cad47-dfc0-4c9d-9255-58c022732fb7\") " pod="openshift-marketplace/marketplace-operator-79b997595-gd9vw" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.100891 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4f4cad47-dfc0-4c9d-9255-58c022732fb7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gd9vw\" (UID: \"4f4cad47-dfc0-4c9d-9255-58c022732fb7\") " pod="openshift-marketplace/marketplace-operator-79b997595-gd9vw" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.101066 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f4cad47-dfc0-4c9d-9255-58c022732fb7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gd9vw\" (UID: \"4f4cad47-dfc0-4c9d-9255-58c022732fb7\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-gd9vw" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.104939 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f4cad47-dfc0-4c9d-9255-58c022732fb7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gd9vw\" (UID: \"4f4cad47-dfc0-4c9d-9255-58c022732fb7\") " pod="openshift-marketplace/marketplace-operator-79b997595-gd9vw" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.106879 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4f4cad47-dfc0-4c9d-9255-58c022732fb7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gd9vw\" (UID: \"4f4cad47-dfc0-4c9d-9255-58c022732fb7\") " pod="openshift-marketplace/marketplace-operator-79b997595-gd9vw" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.115798 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8b99\" (UniqueName: \"kubernetes.io/projected/4f4cad47-dfc0-4c9d-9255-58c022732fb7-kube-api-access-q8b99\") pod \"marketplace-operator-79b997595-gd9vw\" (UID: \"4f4cad47-dfc0-4c9d-9255-58c022732fb7\") " pod="openshift-marketplace/marketplace-operator-79b997595-gd9vw" Nov 26 22:44:29 crc kubenswrapper[5008]: E1126 22:44:29.117108 5008 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc63d5fd4_0d21_44b4_862a_293ceb4321c9.slice/crio-conmon-3531b26b297ab8236e48c7384068bd1f65b75cdfb83f1175e32f8330c1988835.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc63d5fd4_0d21_44b4_862a_293ceb4321c9.slice/crio-3531b26b297ab8236e48c7384068bd1f65b75cdfb83f1175e32f8330c1988835.scope\": RecentStats: unable to find data in memory 
cache]" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.146256 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gd9vw" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.217088 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xqtk6" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.228723 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.250156 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pbtdl" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.278467 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t7zbl" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.282094 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5t8g6" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.304403 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c63d5fd4-0d21-44b4-862a-293ceb4321c9-catalog-content\") pod \"c63d5fd4-0d21-44b4-862a-293ceb4321c9\" (UID: \"c63d5fd4-0d21-44b4-862a-293ceb4321c9\") " Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.304451 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd7r7\" (UniqueName: \"kubernetes.io/projected/d66bc36d-82f1-4825-bb7d-a89eea9587e5-kube-api-access-bd7r7\") pod \"d66bc36d-82f1-4825-bb7d-a89eea9587e5\" (UID: \"d66bc36d-82f1-4825-bb7d-a89eea9587e5\") " Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.304474 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97073e75-5bda-4dc4-8d80-bf408068aaef-marketplace-trusted-ca\") pod \"97073e75-5bda-4dc4-8d80-bf408068aaef\" (UID: \"97073e75-5bda-4dc4-8d80-bf408068aaef\") " Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.304489 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngr7s\" (UniqueName: \"kubernetes.io/projected/3f7c489e-32bf-4761-a529-e8ca560145ad-kube-api-access-ngr7s\") pod \"3f7c489e-32bf-4761-a529-e8ca560145ad\" (UID: \"3f7c489e-32bf-4761-a529-e8ca560145ad\") " Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.304508 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f7c489e-32bf-4761-a529-e8ca560145ad-catalog-content\") pod \"3f7c489e-32bf-4761-a529-e8ca560145ad\" (UID: \"3f7c489e-32bf-4761-a529-e8ca560145ad\") " Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.304751 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-s7csd\" (UniqueName: \"kubernetes.io/projected/97073e75-5bda-4dc4-8d80-bf408068aaef-kube-api-access-s7csd\") pod \"97073e75-5bda-4dc4-8d80-bf408068aaef\" (UID: \"97073e75-5bda-4dc4-8d80-bf408068aaef\") " Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.304779 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f7c489e-32bf-4761-a529-e8ca560145ad-utilities\") pod \"3f7c489e-32bf-4761-a529-e8ca560145ad\" (UID: \"3f7c489e-32bf-4761-a529-e8ca560145ad\") " Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.304795 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278822ff-d2bf-46ef-af05-b6e32a132844-catalog-content\") pod \"278822ff-d2bf-46ef-af05-b6e32a132844\" (UID: \"278822ff-d2bf-46ef-af05-b6e32a132844\") " Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.304825 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97073e75-5bda-4dc4-8d80-bf408068aaef-marketplace-operator-metrics\") pod \"97073e75-5bda-4dc4-8d80-bf408068aaef\" (UID: \"97073e75-5bda-4dc4-8d80-bf408068aaef\") " Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.304856 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d66bc36d-82f1-4825-bb7d-a89eea9587e5-catalog-content\") pod \"d66bc36d-82f1-4825-bb7d-a89eea9587e5\" (UID: \"d66bc36d-82f1-4825-bb7d-a89eea9587e5\") " Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.304876 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c63d5fd4-0d21-44b4-862a-293ceb4321c9-utilities\") pod 
\"c63d5fd4-0d21-44b4-862a-293ceb4321c9\" (UID: \"c63d5fd4-0d21-44b4-862a-293ceb4321c9\") " Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.304906 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278822ff-d2bf-46ef-af05-b6e32a132844-utilities\") pod \"278822ff-d2bf-46ef-af05-b6e32a132844\" (UID: \"278822ff-d2bf-46ef-af05-b6e32a132844\") " Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.304938 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qmpj\" (UniqueName: \"kubernetes.io/projected/278822ff-d2bf-46ef-af05-b6e32a132844-kube-api-access-6qmpj\") pod \"278822ff-d2bf-46ef-af05-b6e32a132844\" (UID: \"278822ff-d2bf-46ef-af05-b6e32a132844\") " Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.304988 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d66bc36d-82f1-4825-bb7d-a89eea9587e5-utilities\") pod \"d66bc36d-82f1-4825-bb7d-a89eea9587e5\" (UID: \"d66bc36d-82f1-4825-bb7d-a89eea9587e5\") " Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.305242 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sxqc\" (UniqueName: \"kubernetes.io/projected/c63d5fd4-0d21-44b4-862a-293ceb4321c9-kube-api-access-9sxqc\") pod \"c63d5fd4-0d21-44b4-862a-293ceb4321c9\" (UID: \"c63d5fd4-0d21-44b4-862a-293ceb4321c9\") " Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.305771 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f7c489e-32bf-4761-a529-e8ca560145ad-utilities" (OuterVolumeSpecName: "utilities") pod "3f7c489e-32bf-4761-a529-e8ca560145ad" (UID: "3f7c489e-32bf-4761-a529-e8ca560145ad"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.315235 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f7c489e-32bf-4761-a529-e8ca560145ad-kube-api-access-ngr7s" (OuterVolumeSpecName: "kube-api-access-ngr7s") pod "3f7c489e-32bf-4761-a529-e8ca560145ad" (UID: "3f7c489e-32bf-4761-a529-e8ca560145ad"). InnerVolumeSpecName "kube-api-access-ngr7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.318673 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c63d5fd4-0d21-44b4-862a-293ceb4321c9-kube-api-access-9sxqc" (OuterVolumeSpecName: "kube-api-access-9sxqc") pod "c63d5fd4-0d21-44b4-862a-293ceb4321c9" (UID: "c63d5fd4-0d21-44b4-862a-293ceb4321c9"). InnerVolumeSpecName "kube-api-access-9sxqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.318782 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97073e75-5bda-4dc4-8d80-bf408068aaef-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "97073e75-5bda-4dc4-8d80-bf408068aaef" (UID: "97073e75-5bda-4dc4-8d80-bf408068aaef"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.319242 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c63d5fd4-0d21-44b4-862a-293ceb4321c9-utilities" (OuterVolumeSpecName: "utilities") pod "c63d5fd4-0d21-44b4-862a-293ceb4321c9" (UID: "c63d5fd4-0d21-44b4-862a-293ceb4321c9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.319324 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d66bc36d-82f1-4825-bb7d-a89eea9587e5-utilities" (OuterVolumeSpecName: "utilities") pod "d66bc36d-82f1-4825-bb7d-a89eea9587e5" (UID: "d66bc36d-82f1-4825-bb7d-a89eea9587e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.319514 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/278822ff-d2bf-46ef-af05-b6e32a132844-utilities" (OuterVolumeSpecName: "utilities") pod "278822ff-d2bf-46ef-af05-b6e32a132844" (UID: "278822ff-d2bf-46ef-af05-b6e32a132844"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.327055 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d66bc36d-82f1-4825-bb7d-a89eea9587e5-kube-api-access-bd7r7" (OuterVolumeSpecName: "kube-api-access-bd7r7") pod "d66bc36d-82f1-4825-bb7d-a89eea9587e5" (UID: "d66bc36d-82f1-4825-bb7d-a89eea9587e5"). InnerVolumeSpecName "kube-api-access-bd7r7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.327124 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97073e75-5bda-4dc4-8d80-bf408068aaef-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "97073e75-5bda-4dc4-8d80-bf408068aaef" (UID: "97073e75-5bda-4dc4-8d80-bf408068aaef"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.330558 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/278822ff-d2bf-46ef-af05-b6e32a132844-kube-api-access-6qmpj" (OuterVolumeSpecName: "kube-api-access-6qmpj") pod "278822ff-d2bf-46ef-af05-b6e32a132844" (UID: "278822ff-d2bf-46ef-af05-b6e32a132844"). InnerVolumeSpecName "kube-api-access-6qmpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.331660 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97073e75-5bda-4dc4-8d80-bf408068aaef-kube-api-access-s7csd" (OuterVolumeSpecName: "kube-api-access-s7csd") pod "97073e75-5bda-4dc4-8d80-bf408068aaef" (UID: "97073e75-5bda-4dc4-8d80-bf408068aaef"). InnerVolumeSpecName "kube-api-access-s7csd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.348652 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c63d5fd4-0d21-44b4-862a-293ceb4321c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c63d5fd4-0d21-44b4-862a-293ceb4321c9" (UID: "c63d5fd4-0d21-44b4-862a-293ceb4321c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.381843 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/278822ff-d2bf-46ef-af05-b6e32a132844-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "278822ff-d2bf-46ef-af05-b6e32a132844" (UID: "278822ff-d2bf-46ef-af05-b6e32a132844"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.405688 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f7c489e-32bf-4761-a529-e8ca560145ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f7c489e-32bf-4761-a529-e8ca560145ad" (UID: "3f7c489e-32bf-4761-a529-e8ca560145ad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.406302 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d66bc36d-82f1-4825-bb7d-a89eea9587e5-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.406322 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sxqc\" (UniqueName: \"kubernetes.io/projected/c63d5fd4-0d21-44b4-862a-293ceb4321c9-kube-api-access-9sxqc\") on node \"crc\" DevicePath \"\"" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.406332 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c63d5fd4-0d21-44b4-862a-293ceb4321c9-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.406340 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd7r7\" (UniqueName: \"kubernetes.io/projected/d66bc36d-82f1-4825-bb7d-a89eea9587e5-kube-api-access-bd7r7\") on node \"crc\" DevicePath \"\"" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.406349 5008 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97073e75-5bda-4dc4-8d80-bf408068aaef-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.406358 5008 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-ngr7s\" (UniqueName: \"kubernetes.io/projected/3f7c489e-32bf-4761-a529-e8ca560145ad-kube-api-access-ngr7s\") on node \"crc\" DevicePath \"\"" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.406366 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f7c489e-32bf-4761-a529-e8ca560145ad-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.406377 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7csd\" (UniqueName: \"kubernetes.io/projected/97073e75-5bda-4dc4-8d80-bf408068aaef-kube-api-access-s7csd\") on node \"crc\" DevicePath \"\"" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.406385 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f7c489e-32bf-4761-a529-e8ca560145ad-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.406397 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278822ff-d2bf-46ef-af05-b6e32a132844-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.406406 5008 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97073e75-5bda-4dc4-8d80-bf408068aaef-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.406417 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c63d5fd4-0d21-44b4-862a-293ceb4321c9-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.406428 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/278822ff-d2bf-46ef-af05-b6e32a132844-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.406436 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qmpj\" (UniqueName: \"kubernetes.io/projected/278822ff-d2bf-46ef-af05-b6e32a132844-kube-api-access-6qmpj\") on node \"crc\" DevicePath \"\"" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.412759 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d66bc36d-82f1-4825-bb7d-a89eea9587e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d66bc36d-82f1-4825-bb7d-a89eea9587e5" (UID: "d66bc36d-82f1-4825-bb7d-a89eea9587e5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.507481 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d66bc36d-82f1-4825-bb7d-a89eea9587e5-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.532525 5008 generic.go:334] "Generic (PLEG): container finished" podID="3f7c489e-32bf-4761-a529-e8ca560145ad" containerID="746688ce78192645402d2b74b01143ca831968ad708da95fcba266b0a9d059b4" exitCode=0 Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.532584 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pbtdl" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.532578 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbtdl" event={"ID":"3f7c489e-32bf-4761-a529-e8ca560145ad","Type":"ContainerDied","Data":"746688ce78192645402d2b74b01143ca831968ad708da95fcba266b0a9d059b4"} Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.532717 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbtdl" event={"ID":"3f7c489e-32bf-4761-a529-e8ca560145ad","Type":"ContainerDied","Data":"34a3ffa0c91bb11053a3f2a3ae1e91d3c10f467743d66f71b1481a048462d11b"} Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.532741 5008 scope.go:117] "RemoveContainer" containerID="746688ce78192645402d2b74b01143ca831968ad708da95fcba266b0a9d059b4" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.534368 5008 generic.go:334] "Generic (PLEG): container finished" podID="97073e75-5bda-4dc4-8d80-bf408068aaef" containerID="c8df45a267dc1bcbf7bf592bb2f4358771c356d549006d2d30985fa9e3171238" exitCode=0 Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.534408 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" event={"ID":"97073e75-5bda-4dc4-8d80-bf408068aaef","Type":"ContainerDied","Data":"c8df45a267dc1bcbf7bf592bb2f4358771c356d549006d2d30985fa9e3171238"} Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.534405 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.534532 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rwq8l" event={"ID":"97073e75-5bda-4dc4-8d80-bf408068aaef","Type":"ContainerDied","Data":"0aab8801d460828a58c4946b04438e5184f808f3343d13d742b94331354c5084"} Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.537178 5008 generic.go:334] "Generic (PLEG): container finished" podID="d66bc36d-82f1-4825-bb7d-a89eea9587e5" containerID="b404bc6877a34491a742a8bc95bfa59028e1dd3706cb64c13a03e3dd0c441f6a" exitCode=0 Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.537291 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5t8g6" event={"ID":"d66bc36d-82f1-4825-bb7d-a89eea9587e5","Type":"ContainerDied","Data":"b404bc6877a34491a742a8bc95bfa59028e1dd3706cb64c13a03e3dd0c441f6a"} Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.537323 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5t8g6" event={"ID":"d66bc36d-82f1-4825-bb7d-a89eea9587e5","Type":"ContainerDied","Data":"faa8d9d66693982b4caeaf2ce53f1987a419c52a55e1115cc14616390913e8d1"} Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.537361 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5t8g6" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.540995 5008 generic.go:334] "Generic (PLEG): container finished" podID="c63d5fd4-0d21-44b4-862a-293ceb4321c9" containerID="3531b26b297ab8236e48c7384068bd1f65b75cdfb83f1175e32f8330c1988835" exitCode=0 Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.541053 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t7zbl" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.541111 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7zbl" event={"ID":"c63d5fd4-0d21-44b4-862a-293ceb4321c9","Type":"ContainerDied","Data":"3531b26b297ab8236e48c7384068bd1f65b75cdfb83f1175e32f8330c1988835"} Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.541137 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7zbl" event={"ID":"c63d5fd4-0d21-44b4-862a-293ceb4321c9","Type":"ContainerDied","Data":"968e33b2444dc2fd2372115521e881033eec0e3a711e19864ec18295c139967d"} Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.545555 5008 generic.go:334] "Generic (PLEG): container finished" podID="278822ff-d2bf-46ef-af05-b6e32a132844" containerID="4bbfce20f22776a4336e0044b343b4879c838bf84613c02e4dd01452046b207a" exitCode=0 Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.545596 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xqtk6" event={"ID":"278822ff-d2bf-46ef-af05-b6e32a132844","Type":"ContainerDied","Data":"4bbfce20f22776a4336e0044b343b4879c838bf84613c02e4dd01452046b207a"} Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.545638 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xqtk6" event={"ID":"278822ff-d2bf-46ef-af05-b6e32a132844","Type":"ContainerDied","Data":"065b83a5611fc9caa8e1c26065c5b17c694188c2a631ad5235b8b55665aff6f7"} Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.545709 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xqtk6" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.591508 5008 scope.go:117] "RemoveContainer" containerID="0d7593ba68a326818344c18f053c68dc33e7cf72fc202f0355715f30dd27a50b" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.600674 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gd9vw"] Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.612763 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xqtk6"] Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.616606 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xqtk6"] Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.624454 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rwq8l"] Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.627160 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rwq8l"] Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.640663 5008 scope.go:117] "RemoveContainer" containerID="5867cd062239b44c8db73c756517bff1042095d36f8205a1004312ea33092e95" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.642647 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5t8g6"] Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.646214 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5t8g6"] Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.682233 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pbtdl"] Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.685574 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-pbtdl"] Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.688031 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t7zbl"] Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.694284 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t7zbl"] Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.718387 5008 scope.go:117] "RemoveContainer" containerID="746688ce78192645402d2b74b01143ca831968ad708da95fcba266b0a9d059b4" Nov 26 22:44:29 crc kubenswrapper[5008]: E1126 22:44:29.720378 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"746688ce78192645402d2b74b01143ca831968ad708da95fcba266b0a9d059b4\": container with ID starting with 746688ce78192645402d2b74b01143ca831968ad708da95fcba266b0a9d059b4 not found: ID does not exist" containerID="746688ce78192645402d2b74b01143ca831968ad708da95fcba266b0a9d059b4" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.720410 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"746688ce78192645402d2b74b01143ca831968ad708da95fcba266b0a9d059b4"} err="failed to get container status \"746688ce78192645402d2b74b01143ca831968ad708da95fcba266b0a9d059b4\": rpc error: code = NotFound desc = could not find container \"746688ce78192645402d2b74b01143ca831968ad708da95fcba266b0a9d059b4\": container with ID starting with 746688ce78192645402d2b74b01143ca831968ad708da95fcba266b0a9d059b4 not found: ID does not exist" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.720436 5008 scope.go:117] "RemoveContainer" containerID="0d7593ba68a326818344c18f053c68dc33e7cf72fc202f0355715f30dd27a50b" Nov 26 22:44:29 crc kubenswrapper[5008]: E1126 22:44:29.720640 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"0d7593ba68a326818344c18f053c68dc33e7cf72fc202f0355715f30dd27a50b\": container with ID starting with 0d7593ba68a326818344c18f053c68dc33e7cf72fc202f0355715f30dd27a50b not found: ID does not exist" containerID="0d7593ba68a326818344c18f053c68dc33e7cf72fc202f0355715f30dd27a50b" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.720663 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d7593ba68a326818344c18f053c68dc33e7cf72fc202f0355715f30dd27a50b"} err="failed to get container status \"0d7593ba68a326818344c18f053c68dc33e7cf72fc202f0355715f30dd27a50b\": rpc error: code = NotFound desc = could not find container \"0d7593ba68a326818344c18f053c68dc33e7cf72fc202f0355715f30dd27a50b\": container with ID starting with 0d7593ba68a326818344c18f053c68dc33e7cf72fc202f0355715f30dd27a50b not found: ID does not exist" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.720678 5008 scope.go:117] "RemoveContainer" containerID="5867cd062239b44c8db73c756517bff1042095d36f8205a1004312ea33092e95" Nov 26 22:44:29 crc kubenswrapper[5008]: E1126 22:44:29.720899 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5867cd062239b44c8db73c756517bff1042095d36f8205a1004312ea33092e95\": container with ID starting with 5867cd062239b44c8db73c756517bff1042095d36f8205a1004312ea33092e95 not found: ID does not exist" containerID="5867cd062239b44c8db73c756517bff1042095d36f8205a1004312ea33092e95" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.720923 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5867cd062239b44c8db73c756517bff1042095d36f8205a1004312ea33092e95"} err="failed to get container status \"5867cd062239b44c8db73c756517bff1042095d36f8205a1004312ea33092e95\": rpc error: code = NotFound desc = could not find container \"5867cd062239b44c8db73c756517bff1042095d36f8205a1004312ea33092e95\": container 
with ID starting with 5867cd062239b44c8db73c756517bff1042095d36f8205a1004312ea33092e95 not found: ID does not exist" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.720936 5008 scope.go:117] "RemoveContainer" containerID="c8df45a267dc1bcbf7bf592bb2f4358771c356d549006d2d30985fa9e3171238" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.749497 5008 scope.go:117] "RemoveContainer" containerID="c8df45a267dc1bcbf7bf592bb2f4358771c356d549006d2d30985fa9e3171238" Nov 26 22:44:29 crc kubenswrapper[5008]: E1126 22:44:29.750074 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8df45a267dc1bcbf7bf592bb2f4358771c356d549006d2d30985fa9e3171238\": container with ID starting with c8df45a267dc1bcbf7bf592bb2f4358771c356d549006d2d30985fa9e3171238 not found: ID does not exist" containerID="c8df45a267dc1bcbf7bf592bb2f4358771c356d549006d2d30985fa9e3171238" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.750119 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8df45a267dc1bcbf7bf592bb2f4358771c356d549006d2d30985fa9e3171238"} err="failed to get container status \"c8df45a267dc1bcbf7bf592bb2f4358771c356d549006d2d30985fa9e3171238\": rpc error: code = NotFound desc = could not find container \"c8df45a267dc1bcbf7bf592bb2f4358771c356d549006d2d30985fa9e3171238\": container with ID starting with c8df45a267dc1bcbf7bf592bb2f4358771c356d549006d2d30985fa9e3171238 not found: ID does not exist" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.750142 5008 scope.go:117] "RemoveContainer" containerID="b404bc6877a34491a742a8bc95bfa59028e1dd3706cb64c13a03e3dd0c441f6a" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.760807 5008 scope.go:117] "RemoveContainer" containerID="d3a166d6e8a93bc3a2b49f47f4e490515beed559ebf35bce98d60b484aae60d5" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.773234 5008 scope.go:117] "RemoveContainer" 
containerID="6f70281f701131d2e0d3fbac329428cd7f3d336a3952d2e0bb0845b5111ef1be" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.785358 5008 scope.go:117] "RemoveContainer" containerID="b404bc6877a34491a742a8bc95bfa59028e1dd3706cb64c13a03e3dd0c441f6a" Nov 26 22:44:29 crc kubenswrapper[5008]: E1126 22:44:29.785745 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b404bc6877a34491a742a8bc95bfa59028e1dd3706cb64c13a03e3dd0c441f6a\": container with ID starting with b404bc6877a34491a742a8bc95bfa59028e1dd3706cb64c13a03e3dd0c441f6a not found: ID does not exist" containerID="b404bc6877a34491a742a8bc95bfa59028e1dd3706cb64c13a03e3dd0c441f6a" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.785792 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b404bc6877a34491a742a8bc95bfa59028e1dd3706cb64c13a03e3dd0c441f6a"} err="failed to get container status \"b404bc6877a34491a742a8bc95bfa59028e1dd3706cb64c13a03e3dd0c441f6a\": rpc error: code = NotFound desc = could not find container \"b404bc6877a34491a742a8bc95bfa59028e1dd3706cb64c13a03e3dd0c441f6a\": container with ID starting with b404bc6877a34491a742a8bc95bfa59028e1dd3706cb64c13a03e3dd0c441f6a not found: ID does not exist" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.785815 5008 scope.go:117] "RemoveContainer" containerID="d3a166d6e8a93bc3a2b49f47f4e490515beed559ebf35bce98d60b484aae60d5" Nov 26 22:44:29 crc kubenswrapper[5008]: E1126 22:44:29.786132 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3a166d6e8a93bc3a2b49f47f4e490515beed559ebf35bce98d60b484aae60d5\": container with ID starting with d3a166d6e8a93bc3a2b49f47f4e490515beed559ebf35bce98d60b484aae60d5 not found: ID does not exist" containerID="d3a166d6e8a93bc3a2b49f47f4e490515beed559ebf35bce98d60b484aae60d5" Nov 26 22:44:29 crc 
kubenswrapper[5008]: I1126 22:44:29.786153 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3a166d6e8a93bc3a2b49f47f4e490515beed559ebf35bce98d60b484aae60d5"} err="failed to get container status \"d3a166d6e8a93bc3a2b49f47f4e490515beed559ebf35bce98d60b484aae60d5\": rpc error: code = NotFound desc = could not find container \"d3a166d6e8a93bc3a2b49f47f4e490515beed559ebf35bce98d60b484aae60d5\": container with ID starting with d3a166d6e8a93bc3a2b49f47f4e490515beed559ebf35bce98d60b484aae60d5 not found: ID does not exist" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.786167 5008 scope.go:117] "RemoveContainer" containerID="6f70281f701131d2e0d3fbac329428cd7f3d336a3952d2e0bb0845b5111ef1be" Nov 26 22:44:29 crc kubenswrapper[5008]: E1126 22:44:29.786467 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f70281f701131d2e0d3fbac329428cd7f3d336a3952d2e0bb0845b5111ef1be\": container with ID starting with 6f70281f701131d2e0d3fbac329428cd7f3d336a3952d2e0bb0845b5111ef1be not found: ID does not exist" containerID="6f70281f701131d2e0d3fbac329428cd7f3d336a3952d2e0bb0845b5111ef1be" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.786497 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f70281f701131d2e0d3fbac329428cd7f3d336a3952d2e0bb0845b5111ef1be"} err="failed to get container status \"6f70281f701131d2e0d3fbac329428cd7f3d336a3952d2e0bb0845b5111ef1be\": rpc error: code = NotFound desc = could not find container \"6f70281f701131d2e0d3fbac329428cd7f3d336a3952d2e0bb0845b5111ef1be\": container with ID starting with 6f70281f701131d2e0d3fbac329428cd7f3d336a3952d2e0bb0845b5111ef1be not found: ID does not exist" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.786513 5008 scope.go:117] "RemoveContainer" containerID="3531b26b297ab8236e48c7384068bd1f65b75cdfb83f1175e32f8330c1988835" Nov 26 
22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.796513 5008 scope.go:117] "RemoveContainer" containerID="b12374100ee123e18cd61cdeb710b12d4c0d8d31acd69abd86e6dbb1c483fa6c" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.810777 5008 scope.go:117] "RemoveContainer" containerID="cff212cff9ace6446904ad73ea5da4dff62454c74c37e5dc84ab174201fa326c" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.825086 5008 scope.go:117] "RemoveContainer" containerID="3531b26b297ab8236e48c7384068bd1f65b75cdfb83f1175e32f8330c1988835" Nov 26 22:44:29 crc kubenswrapper[5008]: E1126 22:44:29.825419 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3531b26b297ab8236e48c7384068bd1f65b75cdfb83f1175e32f8330c1988835\": container with ID starting with 3531b26b297ab8236e48c7384068bd1f65b75cdfb83f1175e32f8330c1988835 not found: ID does not exist" containerID="3531b26b297ab8236e48c7384068bd1f65b75cdfb83f1175e32f8330c1988835" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.825454 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3531b26b297ab8236e48c7384068bd1f65b75cdfb83f1175e32f8330c1988835"} err="failed to get container status \"3531b26b297ab8236e48c7384068bd1f65b75cdfb83f1175e32f8330c1988835\": rpc error: code = NotFound desc = could not find container \"3531b26b297ab8236e48c7384068bd1f65b75cdfb83f1175e32f8330c1988835\": container with ID starting with 3531b26b297ab8236e48c7384068bd1f65b75cdfb83f1175e32f8330c1988835 not found: ID does not exist" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.825483 5008 scope.go:117] "RemoveContainer" containerID="b12374100ee123e18cd61cdeb710b12d4c0d8d31acd69abd86e6dbb1c483fa6c" Nov 26 22:44:29 crc kubenswrapper[5008]: E1126 22:44:29.825946 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b12374100ee123e18cd61cdeb710b12d4c0d8d31acd69abd86e6dbb1c483fa6c\": container with ID starting with b12374100ee123e18cd61cdeb710b12d4c0d8d31acd69abd86e6dbb1c483fa6c not found: ID does not exist" containerID="b12374100ee123e18cd61cdeb710b12d4c0d8d31acd69abd86e6dbb1c483fa6c" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.825998 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b12374100ee123e18cd61cdeb710b12d4c0d8d31acd69abd86e6dbb1c483fa6c"} err="failed to get container status \"b12374100ee123e18cd61cdeb710b12d4c0d8d31acd69abd86e6dbb1c483fa6c\": rpc error: code = NotFound desc = could not find container \"b12374100ee123e18cd61cdeb710b12d4c0d8d31acd69abd86e6dbb1c483fa6c\": container with ID starting with b12374100ee123e18cd61cdeb710b12d4c0d8d31acd69abd86e6dbb1c483fa6c not found: ID does not exist" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.826028 5008 scope.go:117] "RemoveContainer" containerID="cff212cff9ace6446904ad73ea5da4dff62454c74c37e5dc84ab174201fa326c" Nov 26 22:44:29 crc kubenswrapper[5008]: E1126 22:44:29.826373 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cff212cff9ace6446904ad73ea5da4dff62454c74c37e5dc84ab174201fa326c\": container with ID starting with cff212cff9ace6446904ad73ea5da4dff62454c74c37e5dc84ab174201fa326c not found: ID does not exist" containerID="cff212cff9ace6446904ad73ea5da4dff62454c74c37e5dc84ab174201fa326c" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.826400 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cff212cff9ace6446904ad73ea5da4dff62454c74c37e5dc84ab174201fa326c"} err="failed to get container status \"cff212cff9ace6446904ad73ea5da4dff62454c74c37e5dc84ab174201fa326c\": rpc error: code = NotFound desc = could not find container \"cff212cff9ace6446904ad73ea5da4dff62454c74c37e5dc84ab174201fa326c\": container with ID 
starting with cff212cff9ace6446904ad73ea5da4dff62454c74c37e5dc84ab174201fa326c not found: ID does not exist" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.826419 5008 scope.go:117] "RemoveContainer" containerID="4bbfce20f22776a4336e0044b343b4879c838bf84613c02e4dd01452046b207a" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.840474 5008 scope.go:117] "RemoveContainer" containerID="d53930ba9135a214717ba08646a2149522ddf6c278d319ce6d863e83f3699aa2" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.853400 5008 scope.go:117] "RemoveContainer" containerID="87e2772be03699f8637580705cde1e49c0bf01ce86e59ffac408b49371c2b135" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.865201 5008 scope.go:117] "RemoveContainer" containerID="4bbfce20f22776a4336e0044b343b4879c838bf84613c02e4dd01452046b207a" Nov 26 22:44:29 crc kubenswrapper[5008]: E1126 22:44:29.865558 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bbfce20f22776a4336e0044b343b4879c838bf84613c02e4dd01452046b207a\": container with ID starting with 4bbfce20f22776a4336e0044b343b4879c838bf84613c02e4dd01452046b207a not found: ID does not exist" containerID="4bbfce20f22776a4336e0044b343b4879c838bf84613c02e4dd01452046b207a" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.865598 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bbfce20f22776a4336e0044b343b4879c838bf84613c02e4dd01452046b207a"} err="failed to get container status \"4bbfce20f22776a4336e0044b343b4879c838bf84613c02e4dd01452046b207a\": rpc error: code = NotFound desc = could not find container \"4bbfce20f22776a4336e0044b343b4879c838bf84613c02e4dd01452046b207a\": container with ID starting with 4bbfce20f22776a4336e0044b343b4879c838bf84613c02e4dd01452046b207a not found: ID does not exist" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.865628 5008 scope.go:117] "RemoveContainer" 
containerID="d53930ba9135a214717ba08646a2149522ddf6c278d319ce6d863e83f3699aa2" Nov 26 22:44:29 crc kubenswrapper[5008]: E1126 22:44:29.865939 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d53930ba9135a214717ba08646a2149522ddf6c278d319ce6d863e83f3699aa2\": container with ID starting with d53930ba9135a214717ba08646a2149522ddf6c278d319ce6d863e83f3699aa2 not found: ID does not exist" containerID="d53930ba9135a214717ba08646a2149522ddf6c278d319ce6d863e83f3699aa2" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.866033 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d53930ba9135a214717ba08646a2149522ddf6c278d319ce6d863e83f3699aa2"} err="failed to get container status \"d53930ba9135a214717ba08646a2149522ddf6c278d319ce6d863e83f3699aa2\": rpc error: code = NotFound desc = could not find container \"d53930ba9135a214717ba08646a2149522ddf6c278d319ce6d863e83f3699aa2\": container with ID starting with d53930ba9135a214717ba08646a2149522ddf6c278d319ce6d863e83f3699aa2 not found: ID does not exist" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.866062 5008 scope.go:117] "RemoveContainer" containerID="87e2772be03699f8637580705cde1e49c0bf01ce86e59ffac408b49371c2b135" Nov 26 22:44:29 crc kubenswrapper[5008]: E1126 22:44:29.866572 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87e2772be03699f8637580705cde1e49c0bf01ce86e59ffac408b49371c2b135\": container with ID starting with 87e2772be03699f8637580705cde1e49c0bf01ce86e59ffac408b49371c2b135 not found: ID does not exist" containerID="87e2772be03699f8637580705cde1e49c0bf01ce86e59ffac408b49371c2b135" Nov 26 22:44:29 crc kubenswrapper[5008]: I1126 22:44:29.866659 5008 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"87e2772be03699f8637580705cde1e49c0bf01ce86e59ffac408b49371c2b135"} err="failed to get container status \"87e2772be03699f8637580705cde1e49c0bf01ce86e59ffac408b49371c2b135\": rpc error: code = NotFound desc = could not find container \"87e2772be03699f8637580705cde1e49c0bf01ce86e59ffac408b49371c2b135\": container with ID starting with 87e2772be03699f8637580705cde1e49c0bf01ce86e59ffac408b49371c2b135 not found: ID does not exist" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.550876 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gd9vw" event={"ID":"4f4cad47-dfc0-4c9d-9255-58c022732fb7","Type":"ContainerStarted","Data":"af4808855e1e8b7c5b1a167dd527a604b1df761d80a116740596407d5f81a490"} Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.551170 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gd9vw" event={"ID":"4f4cad47-dfc0-4c9d-9255-58c022732fb7","Type":"ContainerStarted","Data":"a4988c6e988424ed72dd1e8b406ec2350b9aebf16d843a4182d6f27771436fa7"} Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.551189 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gd9vw" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.566906 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gd9vw" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.575299 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gd9vw" podStartSLOduration=2.575277067 podStartE2EDuration="2.575277067s" podCreationTimestamp="2025-11-26 22:44:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 
22:44:30.568709667 +0000 UTC m=+345.981403689" watchObservedRunningTime="2025-11-26 22:44:30.575277067 +0000 UTC m=+345.987971079" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.985414 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sngbs"] Nov 26 22:44:30 crc kubenswrapper[5008]: E1126 22:44:30.985675 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66bc36d-82f1-4825-bb7d-a89eea9587e5" containerName="registry-server" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.985695 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66bc36d-82f1-4825-bb7d-a89eea9587e5" containerName="registry-server" Nov 26 22:44:30 crc kubenswrapper[5008]: E1126 22:44:30.985717 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c63d5fd4-0d21-44b4-862a-293ceb4321c9" containerName="extract-content" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.985732 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c63d5fd4-0d21-44b4-862a-293ceb4321c9" containerName="extract-content" Nov 26 22:44:30 crc kubenswrapper[5008]: E1126 22:44:30.985762 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278822ff-d2bf-46ef-af05-b6e32a132844" containerName="registry-server" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.985776 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="278822ff-d2bf-46ef-af05-b6e32a132844" containerName="registry-server" Nov 26 22:44:30 crc kubenswrapper[5008]: E1126 22:44:30.985803 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66bc36d-82f1-4825-bb7d-a89eea9587e5" containerName="extract-content" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.985815 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66bc36d-82f1-4825-bb7d-a89eea9587e5" containerName="extract-content" Nov 26 22:44:30 crc kubenswrapper[5008]: E1126 22:44:30.985832 5008 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="3f7c489e-32bf-4761-a529-e8ca560145ad" containerName="registry-server" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.985844 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f7c489e-32bf-4761-a529-e8ca560145ad" containerName="registry-server" Nov 26 22:44:30 crc kubenswrapper[5008]: E1126 22:44:30.985861 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c63d5fd4-0d21-44b4-862a-293ceb4321c9" containerName="registry-server" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.985873 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c63d5fd4-0d21-44b4-862a-293ceb4321c9" containerName="registry-server" Nov 26 22:44:30 crc kubenswrapper[5008]: E1126 22:44:30.985889 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c63d5fd4-0d21-44b4-862a-293ceb4321c9" containerName="extract-utilities" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.985900 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c63d5fd4-0d21-44b4-862a-293ceb4321c9" containerName="extract-utilities" Nov 26 22:44:30 crc kubenswrapper[5008]: E1126 22:44:30.985918 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f7c489e-32bf-4761-a529-e8ca560145ad" containerName="extract-content" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.985930 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f7c489e-32bf-4761-a529-e8ca560145ad" containerName="extract-content" Nov 26 22:44:30 crc kubenswrapper[5008]: E1126 22:44:30.985945 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278822ff-d2bf-46ef-af05-b6e32a132844" containerName="extract-utilities" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.985959 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="278822ff-d2bf-46ef-af05-b6e32a132844" containerName="extract-utilities" Nov 26 22:44:30 crc kubenswrapper[5008]: E1126 22:44:30.986001 5008 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="3f7c489e-32bf-4761-a529-e8ca560145ad" containerName="extract-utilities" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.986013 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f7c489e-32bf-4761-a529-e8ca560145ad" containerName="extract-utilities" Nov 26 22:44:30 crc kubenswrapper[5008]: E1126 22:44:30.986026 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97073e75-5bda-4dc4-8d80-bf408068aaef" containerName="marketplace-operator" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.986037 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="97073e75-5bda-4dc4-8d80-bf408068aaef" containerName="marketplace-operator" Nov 26 22:44:30 crc kubenswrapper[5008]: E1126 22:44:30.986054 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66bc36d-82f1-4825-bb7d-a89eea9587e5" containerName="extract-utilities" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.986066 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66bc36d-82f1-4825-bb7d-a89eea9587e5" containerName="extract-utilities" Nov 26 22:44:30 crc kubenswrapper[5008]: E1126 22:44:30.986083 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278822ff-d2bf-46ef-af05-b6e32a132844" containerName="extract-content" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.986094 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="278822ff-d2bf-46ef-af05-b6e32a132844" containerName="extract-content" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.986959 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66bc36d-82f1-4825-bb7d-a89eea9587e5" containerName="registry-server" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.987064 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f7c489e-32bf-4761-a529-e8ca560145ad" containerName="registry-server" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.987083 5008 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c63d5fd4-0d21-44b4-862a-293ceb4321c9" containerName="registry-server" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.987100 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="278822ff-d2bf-46ef-af05-b6e32a132844" containerName="registry-server" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.987116 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="97073e75-5bda-4dc4-8d80-bf408068aaef" containerName="marketplace-operator" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.988259 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sngbs" Nov 26 22:44:30 crc kubenswrapper[5008]: I1126 22:44:30.991059 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:30.999955 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sngbs"] Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.025042 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88d8526e-e6a9-46a5-899a-f3f2bab53083-catalog-content\") pod \"certified-operators-sngbs\" (UID: \"88d8526e-e6a9-46a5-899a-f3f2bab53083\") " pod="openshift-marketplace/certified-operators-sngbs" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.025490 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dskpf\" (UniqueName: \"kubernetes.io/projected/88d8526e-e6a9-46a5-899a-f3f2bab53083-kube-api-access-dskpf\") pod \"certified-operators-sngbs\" (UID: \"88d8526e-e6a9-46a5-899a-f3f2bab53083\") " pod="openshift-marketplace/certified-operators-sngbs" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.025702 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88d8526e-e6a9-46a5-899a-f3f2bab53083-utilities\") pod \"certified-operators-sngbs\" (UID: \"88d8526e-e6a9-46a5-899a-f3f2bab53083\") " pod="openshift-marketplace/certified-operators-sngbs" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.127016 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88d8526e-e6a9-46a5-899a-f3f2bab53083-catalog-content\") pod \"certified-operators-sngbs\" (UID: \"88d8526e-e6a9-46a5-899a-f3f2bab53083\") " pod="openshift-marketplace/certified-operators-sngbs" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.127481 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88d8526e-e6a9-46a5-899a-f3f2bab53083-catalog-content\") pod \"certified-operators-sngbs\" (UID: \"88d8526e-e6a9-46a5-899a-f3f2bab53083\") " pod="openshift-marketplace/certified-operators-sngbs" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.127645 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dskpf\" (UniqueName: \"kubernetes.io/projected/88d8526e-e6a9-46a5-899a-f3f2bab53083-kube-api-access-dskpf\") pod \"certified-operators-sngbs\" (UID: \"88d8526e-e6a9-46a5-899a-f3f2bab53083\") " pod="openshift-marketplace/certified-operators-sngbs" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.127673 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88d8526e-e6a9-46a5-899a-f3f2bab53083-utilities\") pod \"certified-operators-sngbs\" (UID: \"88d8526e-e6a9-46a5-899a-f3f2bab53083\") " pod="openshift-marketplace/certified-operators-sngbs" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.128189 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88d8526e-e6a9-46a5-899a-f3f2bab53083-utilities\") pod \"certified-operators-sngbs\" (UID: \"88d8526e-e6a9-46a5-899a-f3f2bab53083\") " pod="openshift-marketplace/certified-operators-sngbs" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.154909 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dskpf\" (UniqueName: \"kubernetes.io/projected/88d8526e-e6a9-46a5-899a-f3f2bab53083-kube-api-access-dskpf\") pod \"certified-operators-sngbs\" (UID: \"88d8526e-e6a9-46a5-899a-f3f2bab53083\") " pod="openshift-marketplace/certified-operators-sngbs" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.193948 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v2vb4"] Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.196614 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v2vb4" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.198686 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v2vb4"] Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.199747 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.229062 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbbwd\" (UniqueName: \"kubernetes.io/projected/145e3925-8c1b-4bf1-8a9a-527f3afa8bb8-kube-api-access-gbbwd\") pod \"redhat-marketplace-v2vb4\" (UID: \"145e3925-8c1b-4bf1-8a9a-527f3afa8bb8\") " pod="openshift-marketplace/redhat-marketplace-v2vb4" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.229177 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/145e3925-8c1b-4bf1-8a9a-527f3afa8bb8-catalog-content\") pod \"redhat-marketplace-v2vb4\" (UID: \"145e3925-8c1b-4bf1-8a9a-527f3afa8bb8\") " pod="openshift-marketplace/redhat-marketplace-v2vb4" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.229222 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/145e3925-8c1b-4bf1-8a9a-527f3afa8bb8-utilities\") pod \"redhat-marketplace-v2vb4\" (UID: \"145e3925-8c1b-4bf1-8a9a-527f3afa8bb8\") " pod="openshift-marketplace/redhat-marketplace-v2vb4" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.317554 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sngbs" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.329996 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/145e3925-8c1b-4bf1-8a9a-527f3afa8bb8-utilities\") pod \"redhat-marketplace-v2vb4\" (UID: \"145e3925-8c1b-4bf1-8a9a-527f3afa8bb8\") " pod="openshift-marketplace/redhat-marketplace-v2vb4" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.330154 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbbwd\" (UniqueName: \"kubernetes.io/projected/145e3925-8c1b-4bf1-8a9a-527f3afa8bb8-kube-api-access-gbbwd\") pod \"redhat-marketplace-v2vb4\" (UID: \"145e3925-8c1b-4bf1-8a9a-527f3afa8bb8\") " pod="openshift-marketplace/redhat-marketplace-v2vb4" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.330213 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/145e3925-8c1b-4bf1-8a9a-527f3afa8bb8-catalog-content\") pod \"redhat-marketplace-v2vb4\" 
(UID: \"145e3925-8c1b-4bf1-8a9a-527f3afa8bb8\") " pod="openshift-marketplace/redhat-marketplace-v2vb4" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.330810 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/145e3925-8c1b-4bf1-8a9a-527f3afa8bb8-utilities\") pod \"redhat-marketplace-v2vb4\" (UID: \"145e3925-8c1b-4bf1-8a9a-527f3afa8bb8\") " pod="openshift-marketplace/redhat-marketplace-v2vb4" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.330837 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/145e3925-8c1b-4bf1-8a9a-527f3afa8bb8-catalog-content\") pod \"redhat-marketplace-v2vb4\" (UID: \"145e3925-8c1b-4bf1-8a9a-527f3afa8bb8\") " pod="openshift-marketplace/redhat-marketplace-v2vb4" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.358342 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbbwd\" (UniqueName: \"kubernetes.io/projected/145e3925-8c1b-4bf1-8a9a-527f3afa8bb8-kube-api-access-gbbwd\") pod \"redhat-marketplace-v2vb4\" (UID: \"145e3925-8c1b-4bf1-8a9a-527f3afa8bb8\") " pod="openshift-marketplace/redhat-marketplace-v2vb4" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.511180 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sngbs"] Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.524652 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="278822ff-d2bf-46ef-af05-b6e32a132844" path="/var/lib/kubelet/pods/278822ff-d2bf-46ef-af05-b6e32a132844/volumes" Nov 26 22:44:31 crc kubenswrapper[5008]: W1126 22:44:31.525013 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88d8526e_e6a9_46a5_899a_f3f2bab53083.slice/crio-e8b41068bb1d7c0644b2d49a862b318934e0d50b701aed40afee31b486e75712 
WatchSource:0}: Error finding container e8b41068bb1d7c0644b2d49a862b318934e0d50b701aed40afee31b486e75712: Status 404 returned error can't find the container with id e8b41068bb1d7c0644b2d49a862b318934e0d50b701aed40afee31b486e75712 Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.525114 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v2vb4" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.525380 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f7c489e-32bf-4761-a529-e8ca560145ad" path="/var/lib/kubelet/pods/3f7c489e-32bf-4761-a529-e8ca560145ad/volumes" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.526024 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97073e75-5bda-4dc4-8d80-bf408068aaef" path="/var/lib/kubelet/pods/97073e75-5bda-4dc4-8d80-bf408068aaef/volumes" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.526836 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c63d5fd4-0d21-44b4-862a-293ceb4321c9" path="/var/lib/kubelet/pods/c63d5fd4-0d21-44b4-862a-293ceb4321c9/volumes" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.527631 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d66bc36d-82f1-4825-bb7d-a89eea9587e5" path="/var/lib/kubelet/pods/d66bc36d-82f1-4825-bb7d-a89eea9587e5/volumes" Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.566847 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sngbs" event={"ID":"88d8526e-e6a9-46a5-899a-f3f2bab53083","Type":"ContainerStarted","Data":"e8b41068bb1d7c0644b2d49a862b318934e0d50b701aed40afee31b486e75712"} Nov 26 22:44:31 crc kubenswrapper[5008]: I1126 22:44:31.736215 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v2vb4"] Nov 26 22:44:31 crc kubenswrapper[5008]: W1126 22:44:31.742836 5008 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod145e3925_8c1b_4bf1_8a9a_527f3afa8bb8.slice/crio-240d49094e9360e77526ede6b625141a5608258defc5bff404424cff62221145 WatchSource:0}: Error finding container 240d49094e9360e77526ede6b625141a5608258defc5bff404424cff62221145: Status 404 returned error can't find the container with id 240d49094e9360e77526ede6b625141a5608258defc5bff404424cff62221145 Nov 26 22:44:32 crc kubenswrapper[5008]: I1126 22:44:32.575892 5008 generic.go:334] "Generic (PLEG): container finished" podID="145e3925-8c1b-4bf1-8a9a-527f3afa8bb8" containerID="cc33140cb78e6c623e22cc11dd48715ef30ca29e48ba8d11c1a6abd52ff0b67f" exitCode=0 Nov 26 22:44:32 crc kubenswrapper[5008]: I1126 22:44:32.575944 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v2vb4" event={"ID":"145e3925-8c1b-4bf1-8a9a-527f3afa8bb8","Type":"ContainerDied","Data":"cc33140cb78e6c623e22cc11dd48715ef30ca29e48ba8d11c1a6abd52ff0b67f"} Nov 26 22:44:32 crc kubenswrapper[5008]: I1126 22:44:32.576012 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v2vb4" event={"ID":"145e3925-8c1b-4bf1-8a9a-527f3afa8bb8","Type":"ContainerStarted","Data":"240d49094e9360e77526ede6b625141a5608258defc5bff404424cff62221145"} Nov 26 22:44:32 crc kubenswrapper[5008]: I1126 22:44:32.577566 5008 generic.go:334] "Generic (PLEG): container finished" podID="88d8526e-e6a9-46a5-899a-f3f2bab53083" containerID="78d7f26068f435b7b195a364dfe2bf345629005bafd01e57ac22f72b13ec6376" exitCode=0 Nov 26 22:44:32 crc kubenswrapper[5008]: I1126 22:44:32.578268 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sngbs" event={"ID":"88d8526e-e6a9-46a5-899a-f3f2bab53083","Type":"ContainerDied","Data":"78d7f26068f435b7b195a364dfe2bf345629005bafd01e57ac22f72b13ec6376"} Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 
22:44:33.390592 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zsc96"] Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.392078 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zsc96" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.396708 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.403455 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zsc96"] Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.457907 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af763e0c-bae0-4cd8-abcc-980ee5edd7ab-utilities\") pod \"community-operators-zsc96\" (UID: \"af763e0c-bae0-4cd8-abcc-980ee5edd7ab\") " pod="openshift-marketplace/community-operators-zsc96" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.458021 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75qbh\" (UniqueName: \"kubernetes.io/projected/af763e0c-bae0-4cd8-abcc-980ee5edd7ab-kube-api-access-75qbh\") pod \"community-operators-zsc96\" (UID: \"af763e0c-bae0-4cd8-abcc-980ee5edd7ab\") " pod="openshift-marketplace/community-operators-zsc96" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.458066 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af763e0c-bae0-4cd8-abcc-980ee5edd7ab-catalog-content\") pod \"community-operators-zsc96\" (UID: \"af763e0c-bae0-4cd8-abcc-980ee5edd7ab\") " pod="openshift-marketplace/community-operators-zsc96" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.561951 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af763e0c-bae0-4cd8-abcc-980ee5edd7ab-catalog-content\") pod \"community-operators-zsc96\" (UID: \"af763e0c-bae0-4cd8-abcc-980ee5edd7ab\") " pod="openshift-marketplace/community-operators-zsc96" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.562038 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af763e0c-bae0-4cd8-abcc-980ee5edd7ab-utilities\") pod \"community-operators-zsc96\" (UID: \"af763e0c-bae0-4cd8-abcc-980ee5edd7ab\") " pod="openshift-marketplace/community-operators-zsc96" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.562118 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75qbh\" (UniqueName: \"kubernetes.io/projected/af763e0c-bae0-4cd8-abcc-980ee5edd7ab-kube-api-access-75qbh\") pod \"community-operators-zsc96\" (UID: \"af763e0c-bae0-4cd8-abcc-980ee5edd7ab\") " pod="openshift-marketplace/community-operators-zsc96" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.562675 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af763e0c-bae0-4cd8-abcc-980ee5edd7ab-catalog-content\") pod \"community-operators-zsc96\" (UID: \"af763e0c-bae0-4cd8-abcc-980ee5edd7ab\") " pod="openshift-marketplace/community-operators-zsc96" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.563616 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af763e0c-bae0-4cd8-abcc-980ee5edd7ab-utilities\") pod \"community-operators-zsc96\" (UID: \"af763e0c-bae0-4cd8-abcc-980ee5edd7ab\") " pod="openshift-marketplace/community-operators-zsc96" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.590223 5008 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-vxsnp"] Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.591838 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vxsnp" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.594702 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75qbh\" (UniqueName: \"kubernetes.io/projected/af763e0c-bae0-4cd8-abcc-980ee5edd7ab-kube-api-access-75qbh\") pod \"community-operators-zsc96\" (UID: \"af763e0c-bae0-4cd8-abcc-980ee5edd7ab\") " pod="openshift-marketplace/community-operators-zsc96" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.599071 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.601374 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vxsnp"] Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.604319 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sngbs" event={"ID":"88d8526e-e6a9-46a5-899a-f3f2bab53083","Type":"ContainerStarted","Data":"79a79ccdf56cd265e11eff702b9f5bd0d914e2b2322aa73e0a31bf5ed3cd9a3b"} Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.615128 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v2vb4" event={"ID":"145e3925-8c1b-4bf1-8a9a-527f3afa8bb8","Type":"ContainerStarted","Data":"ad9c9976976927017f94a42dfc98720c1b0278174485d74e232ca471c259a3e1"} Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.663109 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00d6a7d0-4e7c-4666-b3af-285f532684a8-catalog-content\") pod \"redhat-operators-vxsnp\" (UID: 
\"00d6a7d0-4e7c-4666-b3af-285f532684a8\") " pod="openshift-marketplace/redhat-operators-vxsnp" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.663156 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00d6a7d0-4e7c-4666-b3af-285f532684a8-utilities\") pod \"redhat-operators-vxsnp\" (UID: \"00d6a7d0-4e7c-4666-b3af-285f532684a8\") " pod="openshift-marketplace/redhat-operators-vxsnp" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.663208 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkcxf\" (UniqueName: \"kubernetes.io/projected/00d6a7d0-4e7c-4666-b3af-285f532684a8-kube-api-access-qkcxf\") pod \"redhat-operators-vxsnp\" (UID: \"00d6a7d0-4e7c-4666-b3af-285f532684a8\") " pod="openshift-marketplace/redhat-operators-vxsnp" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.752422 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zsc96" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.764337 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00d6a7d0-4e7c-4666-b3af-285f532684a8-catalog-content\") pod \"redhat-operators-vxsnp\" (UID: \"00d6a7d0-4e7c-4666-b3af-285f532684a8\") " pod="openshift-marketplace/redhat-operators-vxsnp" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.764415 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00d6a7d0-4e7c-4666-b3af-285f532684a8-utilities\") pod \"redhat-operators-vxsnp\" (UID: \"00d6a7d0-4e7c-4666-b3af-285f532684a8\") " pod="openshift-marketplace/redhat-operators-vxsnp" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.764492 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkcxf\" (UniqueName: \"kubernetes.io/projected/00d6a7d0-4e7c-4666-b3af-285f532684a8-kube-api-access-qkcxf\") pod \"redhat-operators-vxsnp\" (UID: \"00d6a7d0-4e7c-4666-b3af-285f532684a8\") " pod="openshift-marketplace/redhat-operators-vxsnp" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.764865 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00d6a7d0-4e7c-4666-b3af-285f532684a8-utilities\") pod \"redhat-operators-vxsnp\" (UID: \"00d6a7d0-4e7c-4666-b3af-285f532684a8\") " pod="openshift-marketplace/redhat-operators-vxsnp" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.765526 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00d6a7d0-4e7c-4666-b3af-285f532684a8-catalog-content\") pod \"redhat-operators-vxsnp\" (UID: \"00d6a7d0-4e7c-4666-b3af-285f532684a8\") " 
pod="openshift-marketplace/redhat-operators-vxsnp" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.783474 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkcxf\" (UniqueName: \"kubernetes.io/projected/00d6a7d0-4e7c-4666-b3af-285f532684a8-kube-api-access-qkcxf\") pod \"redhat-operators-vxsnp\" (UID: \"00d6a7d0-4e7c-4666-b3af-285f532684a8\") " pod="openshift-marketplace/redhat-operators-vxsnp" Nov 26 22:44:33 crc kubenswrapper[5008]: I1126 22:44:33.930407 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zsc96"] Nov 26 22:44:33 crc kubenswrapper[5008]: W1126 22:44:33.938319 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf763e0c_bae0_4cd8_abcc_980ee5edd7ab.slice/crio-933d3ea1bc273567625a99c48172fda773100032d774cc1a3dfb9a95e9f33389 WatchSource:0}: Error finding container 933d3ea1bc273567625a99c48172fda773100032d774cc1a3dfb9a95e9f33389: Status 404 returned error can't find the container with id 933d3ea1bc273567625a99c48172fda773100032d774cc1a3dfb9a95e9f33389 Nov 26 22:44:34 crc kubenswrapper[5008]: I1126 22:44:34.002245 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vxsnp" Nov 26 22:44:34 crc kubenswrapper[5008]: I1126 22:44:34.179623 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vxsnp"] Nov 26 22:44:34 crc kubenswrapper[5008]: W1126 22:44:34.190498 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00d6a7d0_4e7c_4666_b3af_285f532684a8.slice/crio-71cfd4cbec8bf72eb1ffade1b4d850690e4514259fb41241f9ecc0702860b7f3 WatchSource:0}: Error finding container 71cfd4cbec8bf72eb1ffade1b4d850690e4514259fb41241f9ecc0702860b7f3: Status 404 returned error can't find the container with id 71cfd4cbec8bf72eb1ffade1b4d850690e4514259fb41241f9ecc0702860b7f3 Nov 26 22:44:34 crc kubenswrapper[5008]: I1126 22:44:34.626910 5008 generic.go:334] "Generic (PLEG): container finished" podID="145e3925-8c1b-4bf1-8a9a-527f3afa8bb8" containerID="ad9c9976976927017f94a42dfc98720c1b0278174485d74e232ca471c259a3e1" exitCode=0 Nov 26 22:44:34 crc kubenswrapper[5008]: I1126 22:44:34.626985 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v2vb4" event={"ID":"145e3925-8c1b-4bf1-8a9a-527f3afa8bb8","Type":"ContainerDied","Data":"ad9c9976976927017f94a42dfc98720c1b0278174485d74e232ca471c259a3e1"} Nov 26 22:44:34 crc kubenswrapper[5008]: I1126 22:44:34.627488 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v2vb4" event={"ID":"145e3925-8c1b-4bf1-8a9a-527f3afa8bb8","Type":"ContainerStarted","Data":"5c29ad83ab4c87aa57bdffaca97c0925b69b14ee2f330ec9cc89d5b13b59b0bc"} Nov 26 22:44:34 crc kubenswrapper[5008]: I1126 22:44:34.629742 5008 generic.go:334] "Generic (PLEG): container finished" podID="88d8526e-e6a9-46a5-899a-f3f2bab53083" containerID="79a79ccdf56cd265e11eff702b9f5bd0d914e2b2322aa73e0a31bf5ed3cd9a3b" exitCode=0 Nov 26 22:44:34 crc kubenswrapper[5008]: I1126 
22:44:34.629825 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sngbs" event={"ID":"88d8526e-e6a9-46a5-899a-f3f2bab53083","Type":"ContainerDied","Data":"79a79ccdf56cd265e11eff702b9f5bd0d914e2b2322aa73e0a31bf5ed3cd9a3b"} Nov 26 22:44:34 crc kubenswrapper[5008]: I1126 22:44:34.632308 5008 generic.go:334] "Generic (PLEG): container finished" podID="af763e0c-bae0-4cd8-abcc-980ee5edd7ab" containerID="5ff97b82db43d48d99c4dfe578691ec81f3b5897b875c8b97e63fa6a50becdcb" exitCode=0 Nov 26 22:44:34 crc kubenswrapper[5008]: I1126 22:44:34.632403 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zsc96" event={"ID":"af763e0c-bae0-4cd8-abcc-980ee5edd7ab","Type":"ContainerDied","Data":"5ff97b82db43d48d99c4dfe578691ec81f3b5897b875c8b97e63fa6a50becdcb"} Nov 26 22:44:34 crc kubenswrapper[5008]: I1126 22:44:34.632551 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zsc96" event={"ID":"af763e0c-bae0-4cd8-abcc-980ee5edd7ab","Type":"ContainerStarted","Data":"933d3ea1bc273567625a99c48172fda773100032d774cc1a3dfb9a95e9f33389"} Nov 26 22:44:34 crc kubenswrapper[5008]: I1126 22:44:34.637749 5008 generic.go:334] "Generic (PLEG): container finished" podID="00d6a7d0-4e7c-4666-b3af-285f532684a8" containerID="16e2e4e4b960651f215819dbd1741bd0a496767910676dd80bf3720e96ef12a3" exitCode=0 Nov 26 22:44:34 crc kubenswrapper[5008]: I1126 22:44:34.637799 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxsnp" event={"ID":"00d6a7d0-4e7c-4666-b3af-285f532684a8","Type":"ContainerDied","Data":"16e2e4e4b960651f215819dbd1741bd0a496767910676dd80bf3720e96ef12a3"} Nov 26 22:44:34 crc kubenswrapper[5008]: I1126 22:44:34.637833 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxsnp" 
event={"ID":"00d6a7d0-4e7c-4666-b3af-285f532684a8","Type":"ContainerStarted","Data":"71cfd4cbec8bf72eb1ffade1b4d850690e4514259fb41241f9ecc0702860b7f3"} Nov 26 22:44:34 crc kubenswrapper[5008]: I1126 22:44:34.655223 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v2vb4" podStartSLOduration=2.206264525 podStartE2EDuration="3.655204317s" podCreationTimestamp="2025-11-26 22:44:31 +0000 UTC" firstStartedPulling="2025-11-26 22:44:32.578320651 +0000 UTC m=+347.991014653" lastFinishedPulling="2025-11-26 22:44:34.027260443 +0000 UTC m=+349.439954445" observedRunningTime="2025-11-26 22:44:34.653228854 +0000 UTC m=+350.065922886" watchObservedRunningTime="2025-11-26 22:44:34.655204317 +0000 UTC m=+350.067898319" Nov 26 22:44:35 crc kubenswrapper[5008]: I1126 22:44:35.647020 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zsc96" event={"ID":"af763e0c-bae0-4cd8-abcc-980ee5edd7ab","Type":"ContainerStarted","Data":"87d8d251488bc8ae419207315781c81159ec4469ecc29eff55387d9a6854d806"} Nov 26 22:44:35 crc kubenswrapper[5008]: I1126 22:44:35.652663 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sngbs" event={"ID":"88d8526e-e6a9-46a5-899a-f3f2bab53083","Type":"ContainerStarted","Data":"219a752fb78a6793a80c4de7767c27a5aae0ac63ec2a7db28f39a587180d2d91"} Nov 26 22:44:35 crc kubenswrapper[5008]: I1126 22:44:35.654928 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxsnp" event={"ID":"00d6a7d0-4e7c-4666-b3af-285f532684a8","Type":"ContainerStarted","Data":"9073830d25f659a71b4e4ee1946a5a5b3d034db7ae98f0aa93990e2314c39c49"} Nov 26 22:44:35 crc kubenswrapper[5008]: I1126 22:44:35.709375 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sngbs" podStartSLOduration=3.241150747 
podStartE2EDuration="5.709355321s" podCreationTimestamp="2025-11-26 22:44:30 +0000 UTC" firstStartedPulling="2025-11-26 22:44:32.579676515 +0000 UTC m=+347.992370517" lastFinishedPulling="2025-11-26 22:44:35.047881089 +0000 UTC m=+350.460575091" observedRunningTime="2025-11-26 22:44:35.691546448 +0000 UTC m=+351.104240450" watchObservedRunningTime="2025-11-26 22:44:35.709355321 +0000 UTC m=+351.122049323" Nov 26 22:44:36 crc kubenswrapper[5008]: I1126 22:44:36.679174 5008 generic.go:334] "Generic (PLEG): container finished" podID="af763e0c-bae0-4cd8-abcc-980ee5edd7ab" containerID="87d8d251488bc8ae419207315781c81159ec4469ecc29eff55387d9a6854d806" exitCode=0 Nov 26 22:44:36 crc kubenswrapper[5008]: I1126 22:44:36.679797 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zsc96" event={"ID":"af763e0c-bae0-4cd8-abcc-980ee5edd7ab","Type":"ContainerDied","Data":"87d8d251488bc8ae419207315781c81159ec4469ecc29eff55387d9a6854d806"} Nov 26 22:44:36 crc kubenswrapper[5008]: I1126 22:44:36.694953 5008 generic.go:334] "Generic (PLEG): container finished" podID="00d6a7d0-4e7c-4666-b3af-285f532684a8" containerID="9073830d25f659a71b4e4ee1946a5a5b3d034db7ae98f0aa93990e2314c39c49" exitCode=0 Nov 26 22:44:36 crc kubenswrapper[5008]: I1126 22:44:36.696446 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxsnp" event={"ID":"00d6a7d0-4e7c-4666-b3af-285f532684a8","Type":"ContainerDied","Data":"9073830d25f659a71b4e4ee1946a5a5b3d034db7ae98f0aa93990e2314c39c49"} Nov 26 22:44:37 crc kubenswrapper[5008]: I1126 22:44:37.704033 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zsc96" event={"ID":"af763e0c-bae0-4cd8-abcc-980ee5edd7ab","Type":"ContainerStarted","Data":"d4370799d409ca2891abe3bc35cc275e53298ca04124beba8626e0fde1bcbc64"} Nov 26 22:44:37 crc kubenswrapper[5008]: I1126 22:44:37.707000 5008 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-vxsnp" event={"ID":"00d6a7d0-4e7c-4666-b3af-285f532684a8","Type":"ContainerStarted","Data":"a2038bfbb20d593e550f9f7b67540f39aa6f0fa39baea39c03c4afc03bccee84"} Nov 26 22:44:37 crc kubenswrapper[5008]: I1126 22:44:37.723721 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zsc96" podStartSLOduration=2.275242457 podStartE2EDuration="4.723696146s" podCreationTimestamp="2025-11-26 22:44:33 +0000 UTC" firstStartedPulling="2025-11-26 22:44:34.636861018 +0000 UTC m=+350.049555030" lastFinishedPulling="2025-11-26 22:44:37.085314687 +0000 UTC m=+352.498008719" observedRunningTime="2025-11-26 22:44:37.721129214 +0000 UTC m=+353.133823266" watchObservedRunningTime="2025-11-26 22:44:37.723696146 +0000 UTC m=+353.136390178" Nov 26 22:44:41 crc kubenswrapper[5008]: I1126 22:44:41.318143 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sngbs" Nov 26 22:44:41 crc kubenswrapper[5008]: I1126 22:44:41.318453 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sngbs" Nov 26 22:44:41 crc kubenswrapper[5008]: I1126 22:44:41.368944 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sngbs" Nov 26 22:44:41 crc kubenswrapper[5008]: I1126 22:44:41.389817 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vxsnp" podStartSLOduration=5.874018641 podStartE2EDuration="8.389798555s" podCreationTimestamp="2025-11-26 22:44:33 +0000 UTC" firstStartedPulling="2025-11-26 22:44:34.640737482 +0000 UTC m=+350.053431504" lastFinishedPulling="2025-11-26 22:44:37.156517406 +0000 UTC m=+352.569211418" observedRunningTime="2025-11-26 22:44:37.744881438 +0000 UTC m=+353.157575460" watchObservedRunningTime="2025-11-26 
22:44:41.389798555 +0000 UTC m=+356.802492557" Nov 26 22:44:41 crc kubenswrapper[5008]: I1126 22:44:41.526218 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v2vb4" Nov 26 22:44:41 crc kubenswrapper[5008]: I1126 22:44:41.526286 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v2vb4" Nov 26 22:44:41 crc kubenswrapper[5008]: I1126 22:44:41.560597 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v2vb4" Nov 26 22:44:41 crc kubenswrapper[5008]: I1126 22:44:41.778445 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v2vb4" Nov 26 22:44:41 crc kubenswrapper[5008]: I1126 22:44:41.779733 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sngbs" Nov 26 22:44:43 crc kubenswrapper[5008]: I1126 22:44:43.753441 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zsc96" Nov 26 22:44:43 crc kubenswrapper[5008]: I1126 22:44:43.753684 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zsc96" Nov 26 22:44:43 crc kubenswrapper[5008]: I1126 22:44:43.812352 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zsc96" Nov 26 22:44:44 crc kubenswrapper[5008]: I1126 22:44:44.002411 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vxsnp" Nov 26 22:44:44 crc kubenswrapper[5008]: I1126 22:44:44.002651 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vxsnp" Nov 26 22:44:44 crc kubenswrapper[5008]: I1126 
22:44:44.047868 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vxsnp" Nov 26 22:44:44 crc kubenswrapper[5008]: I1126 22:44:44.786374 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zsc96" Nov 26 22:44:44 crc kubenswrapper[5008]: I1126 22:44:44.786819 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vxsnp" Nov 26 22:44:59 crc kubenswrapper[5008]: I1126 22:44:59.281379 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 22:44:59 crc kubenswrapper[5008]: I1126 22:44:59.282095 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 22:45:00 crc kubenswrapper[5008]: I1126 22:45:00.159318 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29403285-cpjdd"] Nov 26 22:45:00 crc kubenswrapper[5008]: I1126 22:45:00.160699 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29403285-cpjdd" Nov 26 22:45:00 crc kubenswrapper[5008]: I1126 22:45:00.162996 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29403285-cpjdd"] Nov 26 22:45:00 crc kubenswrapper[5008]: I1126 22:45:00.166241 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 26 22:45:00 crc kubenswrapper[5008]: I1126 22:45:00.166685 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 26 22:45:00 crc kubenswrapper[5008]: I1126 22:45:00.306378 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0df09ef-8f5c-4921-a299-4fa93bcc068b-config-volume\") pod \"collect-profiles-29403285-cpjdd\" (UID: \"e0df09ef-8f5c-4921-a299-4fa93bcc068b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403285-cpjdd" Nov 26 22:45:00 crc kubenswrapper[5008]: I1126 22:45:00.306513 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d42b\" (UniqueName: \"kubernetes.io/projected/e0df09ef-8f5c-4921-a299-4fa93bcc068b-kube-api-access-7d42b\") pod \"collect-profiles-29403285-cpjdd\" (UID: \"e0df09ef-8f5c-4921-a299-4fa93bcc068b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403285-cpjdd" Nov 26 22:45:00 crc kubenswrapper[5008]: I1126 22:45:00.306544 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e0df09ef-8f5c-4921-a299-4fa93bcc068b-secret-volume\") pod \"collect-profiles-29403285-cpjdd\" (UID: \"e0df09ef-8f5c-4921-a299-4fa93bcc068b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29403285-cpjdd" Nov 26 22:45:00 crc kubenswrapper[5008]: I1126 22:45:00.408114 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0df09ef-8f5c-4921-a299-4fa93bcc068b-config-volume\") pod \"collect-profiles-29403285-cpjdd\" (UID: \"e0df09ef-8f5c-4921-a299-4fa93bcc068b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403285-cpjdd" Nov 26 22:45:00 crc kubenswrapper[5008]: I1126 22:45:00.408227 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d42b\" (UniqueName: \"kubernetes.io/projected/e0df09ef-8f5c-4921-a299-4fa93bcc068b-kube-api-access-7d42b\") pod \"collect-profiles-29403285-cpjdd\" (UID: \"e0df09ef-8f5c-4921-a299-4fa93bcc068b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403285-cpjdd" Nov 26 22:45:00 crc kubenswrapper[5008]: I1126 22:45:00.408290 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e0df09ef-8f5c-4921-a299-4fa93bcc068b-secret-volume\") pod \"collect-profiles-29403285-cpjdd\" (UID: \"e0df09ef-8f5c-4921-a299-4fa93bcc068b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403285-cpjdd" Nov 26 22:45:00 crc kubenswrapper[5008]: I1126 22:45:00.409714 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0df09ef-8f5c-4921-a299-4fa93bcc068b-config-volume\") pod \"collect-profiles-29403285-cpjdd\" (UID: \"e0df09ef-8f5c-4921-a299-4fa93bcc068b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403285-cpjdd" Nov 26 22:45:00 crc kubenswrapper[5008]: I1126 22:45:00.417524 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e0df09ef-8f5c-4921-a299-4fa93bcc068b-secret-volume\") pod \"collect-profiles-29403285-cpjdd\" (UID: \"e0df09ef-8f5c-4921-a299-4fa93bcc068b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403285-cpjdd" Nov 26 22:45:00 crc kubenswrapper[5008]: I1126 22:45:00.437954 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d42b\" (UniqueName: \"kubernetes.io/projected/e0df09ef-8f5c-4921-a299-4fa93bcc068b-kube-api-access-7d42b\") pod \"collect-profiles-29403285-cpjdd\" (UID: \"e0df09ef-8f5c-4921-a299-4fa93bcc068b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403285-cpjdd" Nov 26 22:45:00 crc kubenswrapper[5008]: I1126 22:45:00.484097 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29403285-cpjdd" Nov 26 22:45:00 crc kubenswrapper[5008]: I1126 22:45:00.707681 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29403285-cpjdd"] Nov 26 22:45:00 crc kubenswrapper[5008]: I1126 22:45:00.848522 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29403285-cpjdd" event={"ID":"e0df09ef-8f5c-4921-a299-4fa93bcc068b","Type":"ContainerStarted","Data":"005c2a21e209435f88c5202b041e55723ccb1516cbe5b12e0a4a52b18c33a066"} Nov 26 22:45:01 crc kubenswrapper[5008]: I1126 22:45:01.856399 5008 generic.go:334] "Generic (PLEG): container finished" podID="e0df09ef-8f5c-4921-a299-4fa93bcc068b" containerID="a206af4fb8a822313610f8b2997f2e421b0c8b8e96ecebc3c5f958d0c66702c4" exitCode=0 Nov 26 22:45:01 crc kubenswrapper[5008]: I1126 22:45:01.856520 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29403285-cpjdd" 
event={"ID":"e0df09ef-8f5c-4921-a299-4fa93bcc068b","Type":"ContainerDied","Data":"a206af4fb8a822313610f8b2997f2e421b0c8b8e96ecebc3c5f958d0c66702c4"} Nov 26 22:45:03 crc kubenswrapper[5008]: I1126 22:45:03.179731 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29403285-cpjdd" Nov 26 22:45:03 crc kubenswrapper[5008]: I1126 22:45:03.348349 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d42b\" (UniqueName: \"kubernetes.io/projected/e0df09ef-8f5c-4921-a299-4fa93bcc068b-kube-api-access-7d42b\") pod \"e0df09ef-8f5c-4921-a299-4fa93bcc068b\" (UID: \"e0df09ef-8f5c-4921-a299-4fa93bcc068b\") " Nov 26 22:45:03 crc kubenswrapper[5008]: I1126 22:45:03.348577 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0df09ef-8f5c-4921-a299-4fa93bcc068b-config-volume\") pod \"e0df09ef-8f5c-4921-a299-4fa93bcc068b\" (UID: \"e0df09ef-8f5c-4921-a299-4fa93bcc068b\") " Nov 26 22:45:03 crc kubenswrapper[5008]: I1126 22:45:03.348676 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e0df09ef-8f5c-4921-a299-4fa93bcc068b-secret-volume\") pod \"e0df09ef-8f5c-4921-a299-4fa93bcc068b\" (UID: \"e0df09ef-8f5c-4921-a299-4fa93bcc068b\") " Nov 26 22:45:03 crc kubenswrapper[5008]: I1126 22:45:03.349516 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0df09ef-8f5c-4921-a299-4fa93bcc068b-config-volume" (OuterVolumeSpecName: "config-volume") pod "e0df09ef-8f5c-4921-a299-4fa93bcc068b" (UID: "e0df09ef-8f5c-4921-a299-4fa93bcc068b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:45:03 crc kubenswrapper[5008]: I1126 22:45:03.357540 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0df09ef-8f5c-4921-a299-4fa93bcc068b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e0df09ef-8f5c-4921-a299-4fa93bcc068b" (UID: "e0df09ef-8f5c-4921-a299-4fa93bcc068b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:45:03 crc kubenswrapper[5008]: I1126 22:45:03.357902 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0df09ef-8f5c-4921-a299-4fa93bcc068b-kube-api-access-7d42b" (OuterVolumeSpecName: "kube-api-access-7d42b") pod "e0df09ef-8f5c-4921-a299-4fa93bcc068b" (UID: "e0df09ef-8f5c-4921-a299-4fa93bcc068b"). InnerVolumeSpecName "kube-api-access-7d42b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:45:03 crc kubenswrapper[5008]: I1126 22:45:03.450309 5008 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0df09ef-8f5c-4921-a299-4fa93bcc068b-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 22:45:03 crc kubenswrapper[5008]: I1126 22:45:03.450401 5008 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e0df09ef-8f5c-4921-a299-4fa93bcc068b-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 22:45:03 crc kubenswrapper[5008]: I1126 22:45:03.450426 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d42b\" (UniqueName: \"kubernetes.io/projected/e0df09ef-8f5c-4921-a299-4fa93bcc068b-kube-api-access-7d42b\") on node \"crc\" DevicePath \"\"" Nov 26 22:45:03 crc kubenswrapper[5008]: I1126 22:45:03.872750 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29403285-cpjdd" 
event={"ID":"e0df09ef-8f5c-4921-a299-4fa93bcc068b","Type":"ContainerDied","Data":"005c2a21e209435f88c5202b041e55723ccb1516cbe5b12e0a4a52b18c33a066"} Nov 26 22:45:03 crc kubenswrapper[5008]: I1126 22:45:03.872833 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29403285-cpjdd" Nov 26 22:45:03 crc kubenswrapper[5008]: I1126 22:45:03.872856 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="005c2a21e209435f88c5202b041e55723ccb1516cbe5b12e0a4a52b18c33a066" Nov 26 22:45:29 crc kubenswrapper[5008]: I1126 22:45:29.281686 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 22:45:29 crc kubenswrapper[5008]: I1126 22:45:29.282347 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 22:45:59 crc kubenswrapper[5008]: I1126 22:45:59.281308 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 22:45:59 crc kubenswrapper[5008]: I1126 22:45:59.281914 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 22:45:59 crc kubenswrapper[5008]: I1126 22:45:59.282026 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 26 22:45:59 crc kubenswrapper[5008]: I1126 22:45:59.282804 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2f4b44c1d16055de7e482db16e8660c7a628418920e436930bbd5729f4dfeb2b"} pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 22:45:59 crc kubenswrapper[5008]: I1126 22:45:59.282907 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" containerID="cri-o://2f4b44c1d16055de7e482db16e8660c7a628418920e436930bbd5729f4dfeb2b" gracePeriod=600 Nov 26 22:46:00 crc kubenswrapper[5008]: I1126 22:46:00.285896 5008 generic.go:334] "Generic (PLEG): container finished" podID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerID="2f4b44c1d16055de7e482db16e8660c7a628418920e436930bbd5729f4dfeb2b" exitCode=0 Nov 26 22:46:00 crc kubenswrapper[5008]: I1126 22:46:00.286006 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" event={"ID":"8e558d58-c5ad-41f5-930f-36ac26b1a1ea","Type":"ContainerDied","Data":"2f4b44c1d16055de7e482db16e8660c7a628418920e436930bbd5729f4dfeb2b"} Nov 26 22:46:00 crc kubenswrapper[5008]: I1126 22:46:00.286470 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" 
event={"ID":"8e558d58-c5ad-41f5-930f-36ac26b1a1ea","Type":"ContainerStarted","Data":"983d90254c72d506a65b1c3d88199e1f5ac7b864864378d8067f3493c3b9ada7"} Nov 26 22:46:00 crc kubenswrapper[5008]: I1126 22:46:00.286492 5008 scope.go:117] "RemoveContainer" containerID="c21ec6f8c83c2f57b20436276ae07a2d2946bafbeeb51483de08ceb2cf2ea574" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.047718 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-g7pxp"] Nov 26 22:46:57 crc kubenswrapper[5008]: E1126 22:46:57.048606 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0df09ef-8f5c-4921-a299-4fa93bcc068b" containerName="collect-profiles" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.048627 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0df09ef-8f5c-4921-a299-4fa93bcc068b" containerName="collect-profiles" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.048827 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0df09ef-8f5c-4921-a299-4fa93bcc068b" containerName="collect-profiles" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.049433 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.061155 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-g7pxp"] Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.110464 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/30409576-40a4-4829-bd97-e9eaf746ce22-ca-trust-extracted\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.110502 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/30409576-40a4-4829-bd97-e9eaf746ce22-registry-certificates\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.110554 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pshsd\" (UniqueName: \"kubernetes.io/projected/30409576-40a4-4829-bd97-e9eaf746ce22-kube-api-access-pshsd\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.110576 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/30409576-40a4-4829-bd97-e9eaf746ce22-bound-sa-token\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.110816 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.110983 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/30409576-40a4-4829-bd97-e9eaf746ce22-installation-pull-secrets\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.111015 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/30409576-40a4-4829-bd97-e9eaf746ce22-registry-tls\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.111049 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/30409576-40a4-4829-bd97-e9eaf746ce22-trusted-ca\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.145419 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.212256 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/30409576-40a4-4829-bd97-e9eaf746ce22-installation-pull-secrets\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.212319 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/30409576-40a4-4829-bd97-e9eaf746ce22-registry-tls\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.212358 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/30409576-40a4-4829-bd97-e9eaf746ce22-trusted-ca\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.212454 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/30409576-40a4-4829-bd97-e9eaf746ce22-registry-certificates\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.212488 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/30409576-40a4-4829-bd97-e9eaf746ce22-ca-trust-extracted\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.212520 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pshsd\" (UniqueName: \"kubernetes.io/projected/30409576-40a4-4829-bd97-e9eaf746ce22-kube-api-access-pshsd\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.212555 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/30409576-40a4-4829-bd97-e9eaf746ce22-bound-sa-token\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.213271 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/30409576-40a4-4829-bd97-e9eaf746ce22-ca-trust-extracted\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.214408 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/30409576-40a4-4829-bd97-e9eaf746ce22-registry-certificates\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.214795 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/30409576-40a4-4829-bd97-e9eaf746ce22-trusted-ca\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.220048 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/30409576-40a4-4829-bd97-e9eaf746ce22-installation-pull-secrets\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.220179 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/30409576-40a4-4829-bd97-e9eaf746ce22-registry-tls\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.228561 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/30409576-40a4-4829-bd97-e9eaf746ce22-bound-sa-token\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: \"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.230219 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pshsd\" (UniqueName: \"kubernetes.io/projected/30409576-40a4-4829-bd97-e9eaf746ce22-kube-api-access-pshsd\") pod \"image-registry-66df7c8f76-g7pxp\" (UID: 
\"30409576-40a4-4829-bd97-e9eaf746ce22\") " pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.368070 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.672267 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-g7pxp"] Nov 26 22:46:57 crc kubenswrapper[5008]: W1126 22:46:57.680254 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30409576_40a4_4829_bd97_e9eaf746ce22.slice/crio-68879ba309ad41e8090671080e5c7bea7cfc63e65bab4a9878de92a38940ce49 WatchSource:0}: Error finding container 68879ba309ad41e8090671080e5c7bea7cfc63e65bab4a9878de92a38940ce49: Status 404 returned error can't find the container with id 68879ba309ad41e8090671080e5c7bea7cfc63e65bab4a9878de92a38940ce49 Nov 26 22:46:57 crc kubenswrapper[5008]: I1126 22:46:57.700244 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" event={"ID":"30409576-40a4-4829-bd97-e9eaf746ce22","Type":"ContainerStarted","Data":"68879ba309ad41e8090671080e5c7bea7cfc63e65bab4a9878de92a38940ce49"} Nov 26 22:46:58 crc kubenswrapper[5008]: I1126 22:46:58.711291 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" event={"ID":"30409576-40a4-4829-bd97-e9eaf746ce22","Type":"ContainerStarted","Data":"a47d57ead88b89c2779370cd95ef34c388ea5087b6dd087dc206dd983d254cf6"} Nov 26 22:46:58 crc kubenswrapper[5008]: I1126 22:46:58.712959 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:46:58 crc kubenswrapper[5008]: I1126 22:46:58.740734 5008 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" podStartSLOduration=1.740709393 podStartE2EDuration="1.740709393s" podCreationTimestamp="2025-11-26 22:46:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:46:58.738849284 +0000 UTC m=+494.151543326" watchObservedRunningTime="2025-11-26 22:46:58.740709393 +0000 UTC m=+494.153403425" Nov 26 22:47:17 crc kubenswrapper[5008]: I1126 22:47:17.383245 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-g7pxp" Nov 26 22:47:17 crc kubenswrapper[5008]: I1126 22:47:17.473926 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xf97j"] Nov 26 22:47:42 crc kubenswrapper[5008]: I1126 22:47:42.519632 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" podUID="90717c11-47b1-4265-8ea3-9c826850e812" containerName="registry" containerID="cri-o://746247554c4ebdf934c0e359d585d6bd9869ed384da89958c8c612c5f281dadf" gracePeriod=30 Nov 26 22:47:42 crc kubenswrapper[5008]: I1126 22:47:42.967805 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:47:42 crc kubenswrapper[5008]: I1126 22:47:42.995676 5008 generic.go:334] "Generic (PLEG): container finished" podID="90717c11-47b1-4265-8ea3-9c826850e812" containerID="746247554c4ebdf934c0e359d585d6bd9869ed384da89958c8c612c5f281dadf" exitCode=0 Nov 26 22:47:42 crc kubenswrapper[5008]: I1126 22:47:42.995712 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" event={"ID":"90717c11-47b1-4265-8ea3-9c826850e812","Type":"ContainerDied","Data":"746247554c4ebdf934c0e359d585d6bd9869ed384da89958c8c612c5f281dadf"} Nov 26 22:47:42 crc kubenswrapper[5008]: I1126 22:47:42.995734 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" event={"ID":"90717c11-47b1-4265-8ea3-9c826850e812","Type":"ContainerDied","Data":"65f6fc9f585e3217afad8b36c37d9225947b001f04fc6355ad8b1f19ee23f8e4"} Nov 26 22:47:42 crc kubenswrapper[5008]: I1126 22:47:42.995749 5008 scope.go:117] "RemoveContainer" containerID="746247554c4ebdf934c0e359d585d6bd9869ed384da89958c8c612c5f281dadf" Nov 26 22:47:42 crc kubenswrapper[5008]: I1126 22:47:42.995831 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xf97j" Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.015406 5008 scope.go:117] "RemoveContainer" containerID="746247554c4ebdf934c0e359d585d6bd9869ed384da89958c8c612c5f281dadf" Nov 26 22:47:43 crc kubenswrapper[5008]: E1126 22:47:43.015848 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"746247554c4ebdf934c0e359d585d6bd9869ed384da89958c8c612c5f281dadf\": container with ID starting with 746247554c4ebdf934c0e359d585d6bd9869ed384da89958c8c612c5f281dadf not found: ID does not exist" containerID="746247554c4ebdf934c0e359d585d6bd9869ed384da89958c8c612c5f281dadf" Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.015883 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"746247554c4ebdf934c0e359d585d6bd9869ed384da89958c8c612c5f281dadf"} err="failed to get container status \"746247554c4ebdf934c0e359d585d6bd9869ed384da89958c8c612c5f281dadf\": rpc error: code = NotFound desc = could not find container \"746247554c4ebdf934c0e359d585d6bd9869ed384da89958c8c612c5f281dadf\": container with ID starting with 746247554c4ebdf934c0e359d585d6bd9869ed384da89958c8c612c5f281dadf not found: ID does not exist" Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.128361 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"90717c11-47b1-4265-8ea3-9c826850e812\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.128422 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/90717c11-47b1-4265-8ea3-9c826850e812-bound-sa-token\") pod 
\"90717c11-47b1-4265-8ea3-9c826850e812\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.128529 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/90717c11-47b1-4265-8ea3-9c826850e812-installation-pull-secrets\") pod \"90717c11-47b1-4265-8ea3-9c826850e812\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.128572 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/90717c11-47b1-4265-8ea3-9c826850e812-registry-certificates\") pod \"90717c11-47b1-4265-8ea3-9c826850e812\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.128641 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90717c11-47b1-4265-8ea3-9c826850e812-trusted-ca\") pod \"90717c11-47b1-4265-8ea3-9c826850e812\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.128686 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/90717c11-47b1-4265-8ea3-9c826850e812-ca-trust-extracted\") pod \"90717c11-47b1-4265-8ea3-9c826850e812\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.128740 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/90717c11-47b1-4265-8ea3-9c826850e812-registry-tls\") pod \"90717c11-47b1-4265-8ea3-9c826850e812\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.128798 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-cfrqw\" (UniqueName: \"kubernetes.io/projected/90717c11-47b1-4265-8ea3-9c826850e812-kube-api-access-cfrqw\") pod \"90717c11-47b1-4265-8ea3-9c826850e812\" (UID: \"90717c11-47b1-4265-8ea3-9c826850e812\") " Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.130798 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90717c11-47b1-4265-8ea3-9c826850e812-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "90717c11-47b1-4265-8ea3-9c826850e812" (UID: "90717c11-47b1-4265-8ea3-9c826850e812"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.144355 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90717c11-47b1-4265-8ea3-9c826850e812-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "90717c11-47b1-4265-8ea3-9c826850e812" (UID: "90717c11-47b1-4265-8ea3-9c826850e812"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.144430 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90717c11-47b1-4265-8ea3-9c826850e812-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "90717c11-47b1-4265-8ea3-9c826850e812" (UID: "90717c11-47b1-4265-8ea3-9c826850e812"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.145868 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90717c11-47b1-4265-8ea3-9c826850e812-kube-api-access-cfrqw" (OuterVolumeSpecName: "kube-api-access-cfrqw") pod "90717c11-47b1-4265-8ea3-9c826850e812" (UID: "90717c11-47b1-4265-8ea3-9c826850e812"). 
InnerVolumeSpecName "kube-api-access-cfrqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.149340 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "90717c11-47b1-4265-8ea3-9c826850e812" (UID: "90717c11-47b1-4265-8ea3-9c826850e812"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.155492 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90717c11-47b1-4265-8ea3-9c826850e812-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "90717c11-47b1-4265-8ea3-9c826850e812" (UID: "90717c11-47b1-4265-8ea3-9c826850e812"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.161885 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90717c11-47b1-4265-8ea3-9c826850e812-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "90717c11-47b1-4265-8ea3-9c826850e812" (UID: "90717c11-47b1-4265-8ea3-9c826850e812"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.177670 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90717c11-47b1-4265-8ea3-9c826850e812-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "90717c11-47b1-4265-8ea3-9c826850e812" (UID: "90717c11-47b1-4265-8ea3-9c826850e812"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.230043 5008 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/90717c11-47b1-4265-8ea3-9c826850e812-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.230075 5008 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/90717c11-47b1-4265-8ea3-9c826850e812-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.230089 5008 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/90717c11-47b1-4265-8ea3-9c826850e812-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.230103 5008 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90717c11-47b1-4265-8ea3-9c826850e812-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.230114 5008 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/90717c11-47b1-4265-8ea3-9c826850e812-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.230126 5008 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/90717c11-47b1-4265-8ea3-9c826850e812-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.230136 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfrqw\" (UniqueName: \"kubernetes.io/projected/90717c11-47b1-4265-8ea3-9c826850e812-kube-api-access-cfrqw\") on node \"crc\" DevicePath \"\"" Nov 26 22:47:43 crc 
kubenswrapper[5008]: I1126 22:47:43.354525 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xf97j"] Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.363541 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xf97j"] Nov 26 22:47:43 crc kubenswrapper[5008]: I1126 22:47:43.532335 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90717c11-47b1-4265-8ea3-9c826850e812" path="/var/lib/kubelet/pods/90717c11-47b1-4265-8ea3-9c826850e812/volumes" Nov 26 22:47:59 crc kubenswrapper[5008]: I1126 22:47:59.281456 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 22:47:59 crc kubenswrapper[5008]: I1126 22:47:59.282239 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 22:48:29 crc kubenswrapper[5008]: I1126 22:48:29.281384 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 22:48:29 crc kubenswrapper[5008]: I1126 22:48:29.282326 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 22:48:59 crc kubenswrapper[5008]: I1126 22:48:59.281575 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 22:48:59 crc kubenswrapper[5008]: I1126 22:48:59.282224 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 22:48:59 crc kubenswrapper[5008]: I1126 22:48:59.282293 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 26 22:48:59 crc kubenswrapper[5008]: I1126 22:48:59.283260 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"983d90254c72d506a65b1c3d88199e1f5ac7b864864378d8067f3493c3b9ada7"} pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 22:48:59 crc kubenswrapper[5008]: I1126 22:48:59.283397 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" containerID="cri-o://983d90254c72d506a65b1c3d88199e1f5ac7b864864378d8067f3493c3b9ada7" gracePeriod=600 Nov 26 22:48:59 crc kubenswrapper[5008]: I1126 22:48:59.909192 5008 generic.go:334] "Generic (PLEG): container finished" 
podID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerID="983d90254c72d506a65b1c3d88199e1f5ac7b864864378d8067f3493c3b9ada7" exitCode=0 Nov 26 22:48:59 crc kubenswrapper[5008]: I1126 22:48:59.909299 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" event={"ID":"8e558d58-c5ad-41f5-930f-36ac26b1a1ea","Type":"ContainerDied","Data":"983d90254c72d506a65b1c3d88199e1f5ac7b864864378d8067f3493c3b9ada7"} Nov 26 22:48:59 crc kubenswrapper[5008]: I1126 22:48:59.910067 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" event={"ID":"8e558d58-c5ad-41f5-930f-36ac26b1a1ea","Type":"ContainerStarted","Data":"52afabcce87945b7bdb95fab76560c9840f1056b4d5ad19b5a36a6c1c1f5fb46"} Nov 26 22:48:59 crc kubenswrapper[5008]: I1126 22:48:59.910112 5008 scope.go:117] "RemoveContainer" containerID="2f4b44c1d16055de7e482db16e8660c7a628418920e436930bbd5729f4dfeb2b" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.013133 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zpbmz"] Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.014313 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovn-controller" containerID="cri-o://422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634" gracePeriod=30 Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.014812 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="sbdb" containerID="cri-o://87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591" gracePeriod=30 Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.014883 5008 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="nbdb" containerID="cri-o://83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac" gracePeriod=30 Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.014943 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="northd" containerID="cri-o://7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f" gracePeriod=30 Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.015032 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e" gracePeriod=30 Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.015087 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="kube-rbac-proxy-node" containerID="cri-o://ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523" gracePeriod=30 Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.015144 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovn-acl-logging" containerID="cri-o://fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c" gracePeriod=30 Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.078578 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovnkube-controller" 
containerID="cri-o://f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710" gracePeriod=30 Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.183783 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovnkube-controller/3.log" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.185992 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovn-acl-logging/0.log" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.186766 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovn-controller/0.log" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.187171 5008 generic.go:334] "Generic (PLEG): container finished" podID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerID="219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e" exitCode=0 Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.187199 5008 generic.go:334] "Generic (PLEG): container finished" podID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerID="ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523" exitCode=0 Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.187208 5008 generic.go:334] "Generic (PLEG): container finished" podID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerID="fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c" exitCode=143 Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.187221 5008 generic.go:334] "Generic (PLEG): container finished" podID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerID="422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634" exitCode=143 Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.187235 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" 
event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerDied","Data":"219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e"} Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.187287 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerDied","Data":"ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523"} Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.187308 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerDied","Data":"fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c"} Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.187327 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerDied","Data":"422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634"} Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.189441 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r4xtd_8509b0e0-c914-44a1-a657-ffb4f5a86c18/kube-multus/2.log" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.189867 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r4xtd_8509b0e0-c914-44a1-a657-ffb4f5a86c18/kube-multus/1.log" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.189926 5008 generic.go:334] "Generic (PLEG): container finished" podID="8509b0e0-c914-44a1-a657-ffb4f5a86c18" containerID="0d9e736c1f4ff40e4b3fa98fec6877c0380a481efbf6e05b4d566b33c9d31ba7" exitCode=2 Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.189986 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r4xtd" 
event={"ID":"8509b0e0-c914-44a1-a657-ffb4f5a86c18","Type":"ContainerDied","Data":"0d9e736c1f4ff40e4b3fa98fec6877c0380a481efbf6e05b4d566b33c9d31ba7"} Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.190029 5008 scope.go:117] "RemoveContainer" containerID="7771b2db4cb155a3eaf8d9c456340e0c2fdae6e830412dfb95a69482a8fd39fa" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.190684 5008 scope.go:117] "RemoveContainer" containerID="0d9e736c1f4ff40e4b3fa98fec6877c0380a481efbf6e05b4d566b33c9d31ba7" Nov 26 22:49:41 crc kubenswrapper[5008]: E1126 22:49:41.191114 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-r4xtd_openshift-multus(8509b0e0-c914-44a1-a657-ffb4f5a86c18)\"" pod="openshift-multus/multus-r4xtd" podUID="8509b0e0-c914-44a1-a657-ffb4f5a86c18" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.377210 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovnkube-controller/3.log" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.380324 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovn-acl-logging/0.log" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.380895 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovn-controller/0.log" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.381561 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.430718 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fdqw7"] Nov 26 22:49:41 crc kubenswrapper[5008]: E1126 22:49:41.430897 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovnkube-controller" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.430908 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovnkube-controller" Nov 26 22:49:41 crc kubenswrapper[5008]: E1126 22:49:41.430920 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovn-acl-logging" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.430926 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovn-acl-logging" Nov 26 22:49:41 crc kubenswrapper[5008]: E1126 22:49:41.430935 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="kube-rbac-proxy-node" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.430941 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="kube-rbac-proxy-node" Nov 26 22:49:41 crc kubenswrapper[5008]: E1126 22:49:41.430949 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="nbdb" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.430955 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="nbdb" Nov 26 22:49:41 crc kubenswrapper[5008]: E1126 22:49:41.431003 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" 
containerName="ovn-controller" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431009 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovn-controller" Nov 26 22:49:41 crc kubenswrapper[5008]: E1126 22:49:41.431017 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovnkube-controller" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431023 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovnkube-controller" Nov 26 22:49:41 crc kubenswrapper[5008]: E1126 22:49:41.431032 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90717c11-47b1-4265-8ea3-9c826850e812" containerName="registry" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431038 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="90717c11-47b1-4265-8ea3-9c826850e812" containerName="registry" Nov 26 22:49:41 crc kubenswrapper[5008]: E1126 22:49:41.431047 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="kube-rbac-proxy-ovn-metrics" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431053 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="kube-rbac-proxy-ovn-metrics" Nov 26 22:49:41 crc kubenswrapper[5008]: E1126 22:49:41.431060 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="northd" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431066 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="northd" Nov 26 22:49:41 crc kubenswrapper[5008]: E1126 22:49:41.431074 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" 
containerName="ovnkube-controller" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431079 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovnkube-controller" Nov 26 22:49:41 crc kubenswrapper[5008]: E1126 22:49:41.431086 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="kubecfg-setup" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431091 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="kubecfg-setup" Nov 26 22:49:41 crc kubenswrapper[5008]: E1126 22:49:41.431102 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="sbdb" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431108 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="sbdb" Nov 26 22:49:41 crc kubenswrapper[5008]: E1126 22:49:41.431115 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovnkube-controller" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431120 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovnkube-controller" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431199 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovnkube-controller" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431206 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="kube-rbac-proxy-node" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431216 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="90717c11-47b1-4265-8ea3-9c826850e812" 
containerName="registry" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431224 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="nbdb" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431232 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="sbdb" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431240 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="kube-rbac-proxy-ovn-metrics" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431247 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="northd" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431252 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovnkube-controller" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431260 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovn-acl-logging" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431265 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovnkube-controller" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431272 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovnkube-controller" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431278 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovnkube-controller" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431287 5008 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovn-controller" Nov 26 22:49:41 crc kubenswrapper[5008]: E1126 22:49:41.431373 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovnkube-controller" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.431380 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerName="ovnkube-controller" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.432889 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.452444 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-cni-bin\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.452559 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-kubelet\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.452591 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b81e610c-281d-4697-b94b-43a9e4670378-ovnkube-config\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.452612 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-cni-netd\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.452668 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b81e610c-281d-4697-b94b-43a9e4670378-ovn-node-metrics-cert\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.452692 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-run-ovn\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.452745 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-run-systemd\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.452836 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc 
kubenswrapper[5008]: I1126 22:49:41.452875 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-run-ovn-kubernetes\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.452895 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-var-lib-openvswitch\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.452955 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b81e610c-281d-4697-b94b-43a9e4670378-ovnkube-script-lib\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.453006 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-systemd-units\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.453032 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-log-socket\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.453050 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-node-log\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.453066 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b81e610c-281d-4697-b94b-43a9e4670378-env-overrides\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.453084 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rldd8\" (UniqueName: \"kubernetes.io/projected/b81e610c-281d-4697-b94b-43a9e4670378-kube-api-access-rldd8\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.453175 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-run-openvswitch\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.453231 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-slash\") pod \"ovnkube-node-fdqw7\" (UID: 
\"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.453304 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-run-netns\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.453411 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-etc-openvswitch\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554237 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-run-openvswitch\") pod \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554288 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/41e5d1a8-86e0-42e2-a446-8f8938091dc1-ovnkube-config\") pod \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554313 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-var-lib-openvswitch\") pod \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " Nov 26 22:49:41 crc 
kubenswrapper[5008]: I1126 22:49:41.554343 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-cni-bin\") pod \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554358 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-node-log\") pod \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554379 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-systemd-units\") pod \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554411 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/41e5d1a8-86e0-42e2-a446-8f8938091dc1-ovn-node-metrics-cert\") pod \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554429 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sf7w\" (UniqueName: \"kubernetes.io/projected/41e5d1a8-86e0-42e2-a446-8f8938091dc1-kube-api-access-4sf7w\") pod \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554445 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/41e5d1a8-86e0-42e2-a446-8f8938091dc1-env-overrides\") pod \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554471 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554490 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-cni-netd\") pod \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554511 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-etc-openvswitch\") pod \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554524 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-slash\") pod \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554537 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-log-socket\") pod \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " Nov 26 22:49:41 
crc kubenswrapper[5008]: I1126 22:49:41.554563 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/41e5d1a8-86e0-42e2-a446-8f8938091dc1-ovnkube-script-lib\") pod \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554580 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-run-systemd\") pod \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554616 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-run-ovn\") pod \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554643 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-run-netns\") pod \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554657 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-run-ovn-kubernetes\") pod \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554670 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-kubelet\") pod \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\" (UID: \"41e5d1a8-86e0-42e2-a446-8f8938091dc1\") " Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554773 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b81e610c-281d-4697-b94b-43a9e4670378-ovnkube-script-lib\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554794 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-systemd-units\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554819 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-log-socket\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554835 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-node-log\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554851 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b81e610c-281d-4697-b94b-43a9e4670378-env-overrides\") pod \"ovnkube-node-fdqw7\" (UID: 
\"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554865 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rldd8\" (UniqueName: \"kubernetes.io/projected/b81e610c-281d-4697-b94b-43a9e4670378-kube-api-access-rldd8\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554885 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-run-openvswitch\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554900 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-slash\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554920 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-run-netns\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554944 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-etc-openvswitch\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.554999 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-cni-bin\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.555028 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-kubelet\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.555045 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b81e610c-281d-4697-b94b-43a9e4670378-ovnkube-config\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.555076 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-cni-netd\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.555097 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b81e610c-281d-4697-b94b-43a9e4670378-ovn-node-metrics-cert\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc 
kubenswrapper[5008]: I1126 22:49:41.555112 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-run-ovn\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.555129 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-run-systemd\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.555154 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.555177 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-run-ovn-kubernetes\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.555197 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-var-lib-openvswitch\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc 
kubenswrapper[5008]: I1126 22:49:41.555272 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-var-lib-openvswitch\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.555332 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "41e5d1a8-86e0-42e2-a446-8f8938091dc1" (UID: "41e5d1a8-86e0-42e2-a446-8f8938091dc1"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.555931 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-etc-openvswitch\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.555990 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "41e5d1a8-86e0-42e2-a446-8f8938091dc1" (UID: "41e5d1a8-86e0-42e2-a446-8f8938091dc1"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.556000 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-systemd-units\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.556024 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-log-socket\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.556044 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "41e5d1a8-86e0-42e2-a446-8f8938091dc1" (UID: "41e5d1a8-86e0-42e2-a446-8f8938091dc1"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.556055 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-node-log\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.556071 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "41e5d1a8-86e0-42e2-a446-8f8938091dc1" (UID: "41e5d1a8-86e0-42e2-a446-8f8938091dc1"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.556093 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-node-log" (OuterVolumeSpecName: "node-log") pod "41e5d1a8-86e0-42e2-a446-8f8938091dc1" (UID: "41e5d1a8-86e0-42e2-a446-8f8938091dc1"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.556116 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "41e5d1a8-86e0-42e2-a446-8f8938091dc1" (UID: "41e5d1a8-86e0-42e2-a446-8f8938091dc1"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.556717 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b81e610c-281d-4697-b94b-43a9e4670378-env-overrides\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.556772 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b81e610c-281d-4697-b94b-43a9e4670378-ovnkube-script-lib\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.556806 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "41e5d1a8-86e0-42e2-a446-8f8938091dc1" (UID: 
"41e5d1a8-86e0-42e2-a446-8f8938091dc1"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.556824 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-slash" (OuterVolumeSpecName: "host-slash") pod "41e5d1a8-86e0-42e2-a446-8f8938091dc1" (UID: "41e5d1a8-86e0-42e2-a446-8f8938091dc1"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.556840 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-log-socket" (OuterVolumeSpecName: "log-socket") pod "41e5d1a8-86e0-42e2-a446-8f8938091dc1" (UID: "41e5d1a8-86e0-42e2-a446-8f8938091dc1"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.557059 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-run-openvswitch\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.557097 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-slash\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.557357 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41e5d1a8-86e0-42e2-a446-8f8938091dc1-env-overrides" (OuterVolumeSpecName: "env-overrides") 
pod "41e5d1a8-86e0-42e2-a446-8f8938091dc1" (UID: "41e5d1a8-86e0-42e2-a446-8f8938091dc1"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.557508 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-run-netns\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.557681 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41e5d1a8-86e0-42e2-a446-8f8938091dc1-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "41e5d1a8-86e0-42e2-a446-8f8938091dc1" (UID: "41e5d1a8-86e0-42e2-a446-8f8938091dc1"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.557896 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "41e5d1a8-86e0-42e2-a446-8f8938091dc1" (UID: "41e5d1a8-86e0-42e2-a446-8f8938091dc1"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.557935 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "41e5d1a8-86e0-42e2-a446-8f8938091dc1" (UID: "41e5d1a8-86e0-42e2-a446-8f8938091dc1"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.557979 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "41e5d1a8-86e0-42e2-a446-8f8938091dc1" (UID: "41e5d1a8-86e0-42e2-a446-8f8938091dc1"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.558009 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "41e5d1a8-86e0-42e2-a446-8f8938091dc1" (UID: "41e5d1a8-86e0-42e2-a446-8f8938091dc1"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.558048 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-kubelet\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.558083 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-cni-bin\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.558114 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "41e5d1a8-86e0-42e2-a446-8f8938091dc1" (UID: 
"41e5d1a8-86e0-42e2-a446-8f8938091dc1"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.558149 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-run-systemd\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.558180 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-run-ovn\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.558245 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.558283 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-cni-netd\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.558313 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b81e610c-281d-4697-b94b-43a9e4670378-host-run-ovn-kubernetes\") pod \"ovnkube-node-fdqw7\" (UID: 
\"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.558381 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41e5d1a8-86e0-42e2-a446-8f8938091dc1-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "41e5d1a8-86e0-42e2-a446-8f8938091dc1" (UID: "41e5d1a8-86e0-42e2-a446-8f8938091dc1"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.558778 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b81e610c-281d-4697-b94b-43a9e4670378-ovnkube-config\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.562021 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b81e610c-281d-4697-b94b-43a9e4670378-ovn-node-metrics-cert\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.562412 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41e5d1a8-86e0-42e2-a446-8f8938091dc1-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "41e5d1a8-86e0-42e2-a446-8f8938091dc1" (UID: "41e5d1a8-86e0-42e2-a446-8f8938091dc1"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.567244 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41e5d1a8-86e0-42e2-a446-8f8938091dc1-kube-api-access-4sf7w" (OuterVolumeSpecName: "kube-api-access-4sf7w") pod "41e5d1a8-86e0-42e2-a446-8f8938091dc1" (UID: "41e5d1a8-86e0-42e2-a446-8f8938091dc1"). InnerVolumeSpecName "kube-api-access-4sf7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.572786 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "41e5d1a8-86e0-42e2-a446-8f8938091dc1" (UID: "41e5d1a8-86e0-42e2-a446-8f8938091dc1"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.574131 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rldd8\" (UniqueName: \"kubernetes.io/projected/b81e610c-281d-4697-b94b-43a9e4670378-kube-api-access-rldd8\") pod \"ovnkube-node-fdqw7\" (UID: \"b81e610c-281d-4697-b94b-43a9e4670378\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.656678 5008 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-log-socket\") on node \"crc\" DevicePath \"\"" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.656736 5008 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/41e5d1a8-86e0-42e2-a446-8f8938091dc1-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.656759 5008 reconciler_common.go:293] "Volume detached for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.656777 5008 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.656855 5008 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.656872 5008 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.656891 5008 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.656906 5008 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.656922 5008 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/41e5d1a8-86e0-42e2-a446-8f8938091dc1-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.656938 5008 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-var-lib-openvswitch\") on 
node \"crc\" DevicePath \"\"" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.656954 5008 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.656993 5008 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-node-log\") on node \"crc\" DevicePath \"\"" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.657009 5008 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.657025 5008 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/41e5d1a8-86e0-42e2-a446-8f8938091dc1-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.657045 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sf7w\" (UniqueName: \"kubernetes.io/projected/41e5d1a8-86e0-42e2-a446-8f8938091dc1-kube-api-access-4sf7w\") on node \"crc\" DevicePath \"\"" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.657067 5008 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/41e5d1a8-86e0-42e2-a446-8f8938091dc1-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.657095 5008 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 26 22:49:41 crc 
kubenswrapper[5008]: I1126 22:49:41.657112 5008 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.657132 5008 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.657148 5008 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/41e5d1a8-86e0-42e2-a446-8f8938091dc1-host-slash\") on node \"crc\" DevicePath \"\"" Nov 26 22:49:41 crc kubenswrapper[5008]: I1126 22:49:41.746041 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:41 crc kubenswrapper[5008]: W1126 22:49:41.786801 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb81e610c_281d_4697_b94b_43a9e4670378.slice/crio-bd12d9743902e799b4d96d74a95efb100d464e16f1be5b701c5a5110078164b5 WatchSource:0}: Error finding container bd12d9743902e799b4d96d74a95efb100d464e16f1be5b701c5a5110078164b5: Status 404 returned error can't find the container with id bd12d9743902e799b4d96d74a95efb100d464e16f1be5b701c5a5110078164b5 Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.201360 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovnkube-controller/3.log" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.205452 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovn-acl-logging/0.log" Nov 26 22:49:42 crc 
kubenswrapper[5008]: I1126 22:49:42.206190 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zpbmz_41e5d1a8-86e0-42e2-a446-8f8938091dc1/ovn-controller/0.log" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.206754 5008 generic.go:334] "Generic (PLEG): container finished" podID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerID="f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710" exitCode=0 Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.206836 5008 generic.go:334] "Generic (PLEG): container finished" podID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerID="87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591" exitCode=0 Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.206852 5008 generic.go:334] "Generic (PLEG): container finished" podID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerID="83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac" exitCode=0 Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.206864 5008 generic.go:334] "Generic (PLEG): container finished" podID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" containerID="7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f" exitCode=0 Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.206872 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.206995 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerDied","Data":"f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710"} Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.207094 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerDied","Data":"87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591"} Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.207141 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerDied","Data":"83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac"} Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.207170 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerDied","Data":"7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f"} Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.207197 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpbmz" event={"ID":"41e5d1a8-86e0-42e2-a446-8f8938091dc1","Type":"ContainerDied","Data":"d174e19a6a937007481b98f22da236d13a22050fd1ec28bad2fcd4fc0d55660f"} Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.207231 5008 scope.go:117] "RemoveContainer" containerID="f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.212813 5008 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-r4xtd_8509b0e0-c914-44a1-a657-ffb4f5a86c18/kube-multus/2.log" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.215537 5008 generic.go:334] "Generic (PLEG): container finished" podID="b81e610c-281d-4697-b94b-43a9e4670378" containerID="c85a98a63a5e9cdbae8f3db31ee04966d5e59f769eb4daf7a6a3c12ab767c6b6" exitCode=0 Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.215580 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" event={"ID":"b81e610c-281d-4697-b94b-43a9e4670378","Type":"ContainerDied","Data":"c85a98a63a5e9cdbae8f3db31ee04966d5e59f769eb4daf7a6a3c12ab767c6b6"} Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.215613 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" event={"ID":"b81e610c-281d-4697-b94b-43a9e4670378","Type":"ContainerStarted","Data":"bd12d9743902e799b4d96d74a95efb100d464e16f1be5b701c5a5110078164b5"} Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.252655 5008 scope.go:117] "RemoveContainer" containerID="902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.300481 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zpbmz"] Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.304991 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zpbmz"] Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.309168 5008 scope.go:117] "RemoveContainer" containerID="87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.363056 5008 scope.go:117] "RemoveContainer" containerID="83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.384666 5008 scope.go:117] "RemoveContainer" 
containerID="7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.409222 5008 scope.go:117] "RemoveContainer" containerID="219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.434097 5008 scope.go:117] "RemoveContainer" containerID="ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.466414 5008 scope.go:117] "RemoveContainer" containerID="fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.512148 5008 scope.go:117] "RemoveContainer" containerID="422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.530733 5008 scope.go:117] "RemoveContainer" containerID="b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.572459 5008 scope.go:117] "RemoveContainer" containerID="f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710" Nov 26 22:49:42 crc kubenswrapper[5008]: E1126 22:49:42.573121 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710\": container with ID starting with f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710 not found: ID does not exist" containerID="f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.573343 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710"} err="failed to get container status \"f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710\": rpc error: code = NotFound desc = could not 
find container \"f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710\": container with ID starting with f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.573411 5008 scope.go:117] "RemoveContainer" containerID="902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45" Nov 26 22:49:42 crc kubenswrapper[5008]: E1126 22:49:42.574066 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45\": container with ID starting with 902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45 not found: ID does not exist" containerID="902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.574127 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45"} err="failed to get container status \"902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45\": rpc error: code = NotFound desc = could not find container \"902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45\": container with ID starting with 902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.574168 5008 scope.go:117] "RemoveContainer" containerID="87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591" Nov 26 22:49:42 crc kubenswrapper[5008]: E1126 22:49:42.575485 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\": container with ID starting with 87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591 not found: ID 
does not exist" containerID="87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.575535 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591"} err="failed to get container status \"87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\": rpc error: code = NotFound desc = could not find container \"87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\": container with ID starting with 87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.575563 5008 scope.go:117] "RemoveContainer" containerID="83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac" Nov 26 22:49:42 crc kubenswrapper[5008]: E1126 22:49:42.576416 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\": container with ID starting with 83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac not found: ID does not exist" containerID="83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.576457 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac"} err="failed to get container status \"83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\": rpc error: code = NotFound desc = could not find container \"83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\": container with ID starting with 83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.576486 5008 
scope.go:117] "RemoveContainer" containerID="7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f" Nov 26 22:49:42 crc kubenswrapper[5008]: E1126 22:49:42.577149 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\": container with ID starting with 7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f not found: ID does not exist" containerID="7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.577212 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f"} err="failed to get container status \"7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\": rpc error: code = NotFound desc = could not find container \"7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\": container with ID starting with 7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.577250 5008 scope.go:117] "RemoveContainer" containerID="219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e" Nov 26 22:49:42 crc kubenswrapper[5008]: E1126 22:49:42.577774 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\": container with ID starting with 219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e not found: ID does not exist" containerID="219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.577826 5008 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e"} err="failed to get container status \"219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\": rpc error: code = NotFound desc = could not find container \"219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\": container with ID starting with 219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.577852 5008 scope.go:117] "RemoveContainer" containerID="ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523" Nov 26 22:49:42 crc kubenswrapper[5008]: E1126 22:49:42.578475 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\": container with ID starting with ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523 not found: ID does not exist" containerID="ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.578522 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523"} err="failed to get container status \"ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\": rpc error: code = NotFound desc = could not find container \"ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\": container with ID starting with ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.578549 5008 scope.go:117] "RemoveContainer" containerID="fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c" Nov 26 22:49:42 crc kubenswrapper[5008]: E1126 22:49:42.579307 5008 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\": container with ID starting with fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c not found: ID does not exist" containerID="fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.579358 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c"} err="failed to get container status \"fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\": rpc error: code = NotFound desc = could not find container \"fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\": container with ID starting with fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.579386 5008 scope.go:117] "RemoveContainer" containerID="422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634" Nov 26 22:49:42 crc kubenswrapper[5008]: E1126 22:49:42.580313 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\": container with ID starting with 422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634 not found: ID does not exist" containerID="422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.580356 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634"} err="failed to get container status \"422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\": rpc error: code = NotFound desc = could not find container 
\"422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\": container with ID starting with 422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.580385 5008 scope.go:117] "RemoveContainer" containerID="b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586" Nov 26 22:49:42 crc kubenswrapper[5008]: E1126 22:49:42.580995 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\": container with ID starting with b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586 not found: ID does not exist" containerID="b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.581056 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586"} err="failed to get container status \"b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\": rpc error: code = NotFound desc = could not find container \"b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\": container with ID starting with b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.581093 5008 scope.go:117] "RemoveContainer" containerID="f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.581703 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710"} err="failed to get container status \"f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710\": rpc error: code = NotFound desc = could not find 
container \"f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710\": container with ID starting with f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.581743 5008 scope.go:117] "RemoveContainer" containerID="902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.582204 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45"} err="failed to get container status \"902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45\": rpc error: code = NotFound desc = could not find container \"902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45\": container with ID starting with 902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.582245 5008 scope.go:117] "RemoveContainer" containerID="87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.582842 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591"} err="failed to get container status \"87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\": rpc error: code = NotFound desc = could not find container \"87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\": container with ID starting with 87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.582897 5008 scope.go:117] "RemoveContainer" containerID="83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.583486 5008 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac"} err="failed to get container status \"83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\": rpc error: code = NotFound desc = could not find container \"83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\": container with ID starting with 83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.583524 5008 scope.go:117] "RemoveContainer" containerID="7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.584106 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f"} err="failed to get container status \"7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\": rpc error: code = NotFound desc = could not find container \"7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\": container with ID starting with 7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.584168 5008 scope.go:117] "RemoveContainer" containerID="219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.584622 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e"} err="failed to get container status \"219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\": rpc error: code = NotFound desc = could not find container \"219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\": container with ID starting with 
219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.584707 5008 scope.go:117] "RemoveContainer" containerID="ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.585193 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523"} err="failed to get container status \"ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\": rpc error: code = NotFound desc = could not find container \"ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\": container with ID starting with ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.585230 5008 scope.go:117] "RemoveContainer" containerID="fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.586703 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c"} err="failed to get container status \"fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\": rpc error: code = NotFound desc = could not find container \"fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\": container with ID starting with fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.586757 5008 scope.go:117] "RemoveContainer" containerID="422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.588092 5008 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634"} err="failed to get container status \"422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\": rpc error: code = NotFound desc = could not find container \"422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\": container with ID starting with 422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.588132 5008 scope.go:117] "RemoveContainer" containerID="b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.589282 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586"} err="failed to get container status \"b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\": rpc error: code = NotFound desc = could not find container \"b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\": container with ID starting with b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.589321 5008 scope.go:117] "RemoveContainer" containerID="f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.589895 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710"} err="failed to get container status \"f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710\": rpc error: code = NotFound desc = could not find container \"f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710\": container with ID starting with f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710 not found: ID does not 
exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.589952 5008 scope.go:117] "RemoveContainer" containerID="902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.590720 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45"} err="failed to get container status \"902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45\": rpc error: code = NotFound desc = could not find container \"902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45\": container with ID starting with 902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.590777 5008 scope.go:117] "RemoveContainer" containerID="87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.591329 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591"} err="failed to get container status \"87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\": rpc error: code = NotFound desc = could not find container \"87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\": container with ID starting with 87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.591377 5008 scope.go:117] "RemoveContainer" containerID="83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.591894 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac"} err="failed to get container status 
\"83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\": rpc error: code = NotFound desc = could not find container \"83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\": container with ID starting with 83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.591938 5008 scope.go:117] "RemoveContainer" containerID="7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.592579 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f"} err="failed to get container status \"7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\": rpc error: code = NotFound desc = could not find container \"7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\": container with ID starting with 7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.592617 5008 scope.go:117] "RemoveContainer" containerID="219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.593221 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e"} err="failed to get container status \"219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\": rpc error: code = NotFound desc = could not find container \"219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\": container with ID starting with 219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.593259 5008 scope.go:117] "RemoveContainer" 
containerID="ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.593893 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523"} err="failed to get container status \"ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\": rpc error: code = NotFound desc = could not find container \"ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\": container with ID starting with ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.593935 5008 scope.go:117] "RemoveContainer" containerID="fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.594456 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c"} err="failed to get container status \"fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\": rpc error: code = NotFound desc = could not find container \"fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\": container with ID starting with fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.594497 5008 scope.go:117] "RemoveContainer" containerID="422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.595089 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634"} err="failed to get container status \"422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\": rpc error: code = NotFound desc = could 
not find container \"422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\": container with ID starting with 422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.595129 5008 scope.go:117] "RemoveContainer" containerID="b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.595693 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586"} err="failed to get container status \"b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\": rpc error: code = NotFound desc = could not find container \"b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\": container with ID starting with b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.595742 5008 scope.go:117] "RemoveContainer" containerID="f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.596333 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710"} err="failed to get container status \"f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710\": rpc error: code = NotFound desc = could not find container \"f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710\": container with ID starting with f83270f6571d329e9bbeae75c20c3d5861911f047a2d5f2302d1273166528710 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.596373 5008 scope.go:117] "RemoveContainer" containerID="902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 
22:49:42.596850 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45"} err="failed to get container status \"902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45\": rpc error: code = NotFound desc = could not find container \"902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45\": container with ID starting with 902691723e225daedc796a0704111bf9b98fb9864d25d312aaf21cedb70b6e45 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.596884 5008 scope.go:117] "RemoveContainer" containerID="87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.597386 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591"} err="failed to get container status \"87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\": rpc error: code = NotFound desc = could not find container \"87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591\": container with ID starting with 87f24abe1e6f55397f087e9cfb292f2a1b82a3ffb25e7fda0aed11e16d041591 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.597432 5008 scope.go:117] "RemoveContainer" containerID="83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.597877 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac"} err="failed to get container status \"83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\": rpc error: code = NotFound desc = could not find container \"83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac\": container with ID starting with 
83647d5608b88152fc452946719435a85cf26e17876ee44f356b14da61d0eaac not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.597914 5008 scope.go:117] "RemoveContainer" containerID="7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.598328 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f"} err="failed to get container status \"7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\": rpc error: code = NotFound desc = could not find container \"7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f\": container with ID starting with 7360b8d70f24dc718a139bc33287cfe7f6be031f96b0c70f5b61f279da7c436f not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.598480 5008 scope.go:117] "RemoveContainer" containerID="219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.599022 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e"} err="failed to get container status \"219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\": rpc error: code = NotFound desc = could not find container \"219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e\": container with ID starting with 219540015dd12820244b6f212c9b7e714bc2062e67095a6cf82c39efb9560e9e not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.599061 5008 scope.go:117] "RemoveContainer" containerID="ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.599445 5008 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523"} err="failed to get container status \"ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\": rpc error: code = NotFound desc = could not find container \"ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523\": container with ID starting with ee0c2777b91a865bb40323802a8400fc67842c496de4b5e463adbb939ea21523 not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.599471 5008 scope.go:117] "RemoveContainer" containerID="fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.599813 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c"} err="failed to get container status \"fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\": rpc error: code = NotFound desc = could not find container \"fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c\": container with ID starting with fd3f2294a7a30baac35868fa678c99cdab0c1be841ffcaff7a4189bd22d0df9c not found: ID does not exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.599847 5008 scope.go:117] "RemoveContainer" containerID="422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.600214 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634"} err="failed to get container status \"422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\": rpc error: code = NotFound desc = could not find container \"422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634\": container with ID starting with 422b2efefd39cb9425160e47849947eb726a552555d21024cfd5c138ba795634 not found: ID does not 
exist" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.600248 5008 scope.go:117] "RemoveContainer" containerID="b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586" Nov 26 22:49:42 crc kubenswrapper[5008]: I1126 22:49:42.600548 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586"} err="failed to get container status \"b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\": rpc error: code = NotFound desc = could not find container \"b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586\": container with ID starting with b599665a1f6ec12f63968fce9ebf40ba8ef01098172412f2129c4e4d3199c586 not found: ID does not exist" Nov 26 22:49:43 crc kubenswrapper[5008]: I1126 22:49:43.229574 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" event={"ID":"b81e610c-281d-4697-b94b-43a9e4670378","Type":"ContainerStarted","Data":"bac662d754d520082c8c47b9d4c4bf9280e196a43af3ea28f941d445e33468fa"} Nov 26 22:49:43 crc kubenswrapper[5008]: I1126 22:49:43.229909 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" event={"ID":"b81e610c-281d-4697-b94b-43a9e4670378","Type":"ContainerStarted","Data":"bde981d9baa89e234606760ab6bf6d1623024ef907b29019a58c5d7b754e8e23"} Nov 26 22:49:43 crc kubenswrapper[5008]: I1126 22:49:43.229921 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" event={"ID":"b81e610c-281d-4697-b94b-43a9e4670378","Type":"ContainerStarted","Data":"3774fccfd541e92c7ff9b478283bf10396aa740af27c2bc4799e4b482c7da4a6"} Nov 26 22:49:43 crc kubenswrapper[5008]: I1126 22:49:43.229937 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" 
event={"ID":"b81e610c-281d-4697-b94b-43a9e4670378","Type":"ContainerStarted","Data":"3ebf7b0f0145e7b64cbbdf7bf199b63cca0c185f6133e33daaf786dbf586bba6"} Nov 26 22:49:43 crc kubenswrapper[5008]: I1126 22:49:43.229946 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" event={"ID":"b81e610c-281d-4697-b94b-43a9e4670378","Type":"ContainerStarted","Data":"24fe2590c77a3ccdabd1fc01c91b9ee0b96f96d737af70f06a2fa1c60b3eb494"} Nov 26 22:49:43 crc kubenswrapper[5008]: I1126 22:49:43.229955 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" event={"ID":"b81e610c-281d-4697-b94b-43a9e4670378","Type":"ContainerStarted","Data":"abb37fe1e99264f2197fe8c4817b88d2d5b0f6d5cafd0d24dcbe9cbd8232c8de"} Nov 26 22:49:43 crc kubenswrapper[5008]: I1126 22:49:43.531848 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41e5d1a8-86e0-42e2-a446-8f8938091dc1" path="/var/lib/kubelet/pods/41e5d1a8-86e0-42e2-a446-8f8938091dc1/volumes" Nov 26 22:49:46 crc kubenswrapper[5008]: I1126 22:49:46.257454 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" event={"ID":"b81e610c-281d-4697-b94b-43a9e4670378","Type":"ContainerStarted","Data":"fb4fd5e35e9d8ddce9d4d73e2b6451938a2e4b1890912b2fb94d3f1206ab918c"} Nov 26 22:49:48 crc kubenswrapper[5008]: I1126 22:49:48.279164 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" event={"ID":"b81e610c-281d-4697-b94b-43a9e4670378","Type":"ContainerStarted","Data":"d34c8bc15124bcf56eefec0b39236faa108fa77ca445b3c729b69c5ef39ed8d8"} Nov 26 22:49:48 crc kubenswrapper[5008]: I1126 22:49:48.279779 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:48 crc kubenswrapper[5008]: I1126 22:49:48.279811 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:48 crc kubenswrapper[5008]: I1126 22:49:48.321722 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:48 crc kubenswrapper[5008]: I1126 22:49:48.325209 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" podStartSLOduration=7.325194361 podStartE2EDuration="7.325194361s" podCreationTimestamp="2025-11-26 22:49:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:49:48.324474298 +0000 UTC m=+663.737168320" watchObservedRunningTime="2025-11-26 22:49:48.325194361 +0000 UTC m=+663.737888363" Nov 26 22:49:49 crc kubenswrapper[5008]: I1126 22:49:49.291208 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:49 crc kubenswrapper[5008]: I1126 22:49:49.374076 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:49:53 crc kubenswrapper[5008]: I1126 22:49:53.519217 5008 scope.go:117] "RemoveContainer" containerID="0d9e736c1f4ff40e4b3fa98fec6877c0380a481efbf6e05b4d566b33c9d31ba7" Nov 26 22:49:53 crc kubenswrapper[5008]: E1126 22:49:53.520440 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-r4xtd_openshift-multus(8509b0e0-c914-44a1-a657-ffb4f5a86c18)\"" pod="openshift-multus/multus-r4xtd" podUID="8509b0e0-c914-44a1-a657-ffb4f5a86c18" Nov 26 22:50:05 crc kubenswrapper[5008]: I1126 22:50:05.521443 5008 scope.go:117] "RemoveContainer" containerID="0d9e736c1f4ff40e4b3fa98fec6877c0380a481efbf6e05b4d566b33c9d31ba7" Nov 26 22:50:06 crc kubenswrapper[5008]: I1126 
22:50:06.398422 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r4xtd_8509b0e0-c914-44a1-a657-ffb4f5a86c18/kube-multus/2.log" Nov 26 22:50:06 crc kubenswrapper[5008]: I1126 22:50:06.398833 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r4xtd" event={"ID":"8509b0e0-c914-44a1-a657-ffb4f5a86c18","Type":"ContainerStarted","Data":"2971a8d154444bb3b014c2ceac58011b696816a76d5e4957a1c8f11e15cbe1a3"} Nov 26 22:50:11 crc kubenswrapper[5008]: I1126 22:50:11.770107 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fdqw7" Nov 26 22:50:16 crc kubenswrapper[5008]: I1126 22:50:16.495953 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz"] Nov 26 22:50:16 crc kubenswrapper[5008]: I1126 22:50:16.498217 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz" Nov 26 22:50:16 crc kubenswrapper[5008]: I1126 22:50:16.501539 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 26 22:50:16 crc kubenswrapper[5008]: I1126 22:50:16.516890 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz"] Nov 26 22:50:16 crc kubenswrapper[5008]: I1126 22:50:16.679512 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/563e6e0b-afe9-409c-b3e6-6bd842412c38-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz\" (UID: \"563e6e0b-afe9-409c-b3e6-6bd842412c38\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz" Nov 26 22:50:16 crc 
kubenswrapper[5008]: I1126 22:50:16.679668 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22v4p\" (UniqueName: \"kubernetes.io/projected/563e6e0b-afe9-409c-b3e6-6bd842412c38-kube-api-access-22v4p\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz\" (UID: \"563e6e0b-afe9-409c-b3e6-6bd842412c38\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz" Nov 26 22:50:16 crc kubenswrapper[5008]: I1126 22:50:16.679722 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/563e6e0b-afe9-409c-b3e6-6bd842412c38-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz\" (UID: \"563e6e0b-afe9-409c-b3e6-6bd842412c38\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz" Nov 26 22:50:16 crc kubenswrapper[5008]: I1126 22:50:16.781352 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/563e6e0b-afe9-409c-b3e6-6bd842412c38-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz\" (UID: \"563e6e0b-afe9-409c-b3e6-6bd842412c38\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz" Nov 26 22:50:16 crc kubenswrapper[5008]: I1126 22:50:16.781476 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22v4p\" (UniqueName: \"kubernetes.io/projected/563e6e0b-afe9-409c-b3e6-6bd842412c38-kube-api-access-22v4p\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz\" (UID: \"563e6e0b-afe9-409c-b3e6-6bd842412c38\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz" Nov 26 22:50:16 crc kubenswrapper[5008]: I1126 22:50:16.781527 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/563e6e0b-afe9-409c-b3e6-6bd842412c38-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz\" (UID: \"563e6e0b-afe9-409c-b3e6-6bd842412c38\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz" Nov 26 22:50:16 crc kubenswrapper[5008]: I1126 22:50:16.782221 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/563e6e0b-afe9-409c-b3e6-6bd842412c38-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz\" (UID: \"563e6e0b-afe9-409c-b3e6-6bd842412c38\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz" Nov 26 22:50:16 crc kubenswrapper[5008]: I1126 22:50:16.782308 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/563e6e0b-afe9-409c-b3e6-6bd842412c38-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz\" (UID: \"563e6e0b-afe9-409c-b3e6-6bd842412c38\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz" Nov 26 22:50:16 crc kubenswrapper[5008]: I1126 22:50:16.817491 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22v4p\" (UniqueName: \"kubernetes.io/projected/563e6e0b-afe9-409c-b3e6-6bd842412c38-kube-api-access-22v4p\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz\" (UID: \"563e6e0b-afe9-409c-b3e6-6bd842412c38\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz" Nov 26 22:50:16 crc kubenswrapper[5008]: I1126 22:50:16.819383 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz"
Nov 26 22:50:17 crc kubenswrapper[5008]: I1126 22:50:17.138806 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz"]
Nov 26 22:50:17 crc kubenswrapper[5008]: I1126 22:50:17.463352 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz" event={"ID":"563e6e0b-afe9-409c-b3e6-6bd842412c38","Type":"ContainerStarted","Data":"fe8c1aca8f1be284d4af737541befebff87ce7e56a030ee5d93a6fb7733ff0f5"}
Nov 26 22:50:17 crc kubenswrapper[5008]: I1126 22:50:17.463400 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz" event={"ID":"563e6e0b-afe9-409c-b3e6-6bd842412c38","Type":"ContainerStarted","Data":"6d46d566e182cc9fcb284c5e9c1f2f0d0ae6c3452a843ac3bd21a5d4b4c22796"}
Nov 26 22:50:18 crc kubenswrapper[5008]: I1126 22:50:18.473007 5008 generic.go:334] "Generic (PLEG): container finished" podID="563e6e0b-afe9-409c-b3e6-6bd842412c38" containerID="fe8c1aca8f1be284d4af737541befebff87ce7e56a030ee5d93a6fb7733ff0f5" exitCode=0
Nov 26 22:50:18 crc kubenswrapper[5008]: I1126 22:50:18.473092 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz" event={"ID":"563e6e0b-afe9-409c-b3e6-6bd842412c38","Type":"ContainerDied","Data":"fe8c1aca8f1be284d4af737541befebff87ce7e56a030ee5d93a6fb7733ff0f5"}
Nov 26 22:50:18 crc kubenswrapper[5008]: I1126 22:50:18.475140 5008 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 26 22:50:20 crc kubenswrapper[5008]: I1126 22:50:20.490813 5008 generic.go:334] "Generic (PLEG): container finished" podID="563e6e0b-afe9-409c-b3e6-6bd842412c38" containerID="ba8107e97e6204f1dd3fe3a05de3a4b7cd332afbe27e395939df5a33f1ecb3de" exitCode=0
Nov 26 22:50:20 crc kubenswrapper[5008]: I1126 22:50:20.491023 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz" event={"ID":"563e6e0b-afe9-409c-b3e6-6bd842412c38","Type":"ContainerDied","Data":"ba8107e97e6204f1dd3fe3a05de3a4b7cd332afbe27e395939df5a33f1ecb3de"}
Nov 26 22:50:21 crc kubenswrapper[5008]: I1126 22:50:21.505948 5008 generic.go:334] "Generic (PLEG): container finished" podID="563e6e0b-afe9-409c-b3e6-6bd842412c38" containerID="edcd2857e57e439873381ee693a652e79143d53267e8e4e38baffeb6a485aa46" exitCode=0
Nov 26 22:50:21 crc kubenswrapper[5008]: I1126 22:50:21.506039 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz" event={"ID":"563e6e0b-afe9-409c-b3e6-6bd842412c38","Type":"ContainerDied","Data":"edcd2857e57e439873381ee693a652e79143d53267e8e4e38baffeb6a485aa46"}
Nov 26 22:50:22 crc kubenswrapper[5008]: I1126 22:50:22.772168 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz"
Nov 26 22:50:22 crc kubenswrapper[5008]: I1126 22:50:22.864980 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22v4p\" (UniqueName: \"kubernetes.io/projected/563e6e0b-afe9-409c-b3e6-6bd842412c38-kube-api-access-22v4p\") pod \"563e6e0b-afe9-409c-b3e6-6bd842412c38\" (UID: \"563e6e0b-afe9-409c-b3e6-6bd842412c38\") "
Nov 26 22:50:22 crc kubenswrapper[5008]: I1126 22:50:22.865090 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/563e6e0b-afe9-409c-b3e6-6bd842412c38-util\") pod \"563e6e0b-afe9-409c-b3e6-6bd842412c38\" (UID: \"563e6e0b-afe9-409c-b3e6-6bd842412c38\") "
Nov 26 22:50:22 crc kubenswrapper[5008]: I1126 22:50:22.865152 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/563e6e0b-afe9-409c-b3e6-6bd842412c38-bundle\") pod \"563e6e0b-afe9-409c-b3e6-6bd842412c38\" (UID: \"563e6e0b-afe9-409c-b3e6-6bd842412c38\") "
Nov 26 22:50:22 crc kubenswrapper[5008]: I1126 22:50:22.866376 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/563e6e0b-afe9-409c-b3e6-6bd842412c38-bundle" (OuterVolumeSpecName: "bundle") pod "563e6e0b-afe9-409c-b3e6-6bd842412c38" (UID: "563e6e0b-afe9-409c-b3e6-6bd842412c38"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 22:50:22 crc kubenswrapper[5008]: I1126 22:50:22.874499 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/563e6e0b-afe9-409c-b3e6-6bd842412c38-kube-api-access-22v4p" (OuterVolumeSpecName: "kube-api-access-22v4p") pod "563e6e0b-afe9-409c-b3e6-6bd842412c38" (UID: "563e6e0b-afe9-409c-b3e6-6bd842412c38"). InnerVolumeSpecName "kube-api-access-22v4p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:50:22 crc kubenswrapper[5008]: I1126 22:50:22.966666 5008 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/563e6e0b-afe9-409c-b3e6-6bd842412c38-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 22:50:22 crc kubenswrapper[5008]: I1126 22:50:22.966726 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22v4p\" (UniqueName: \"kubernetes.io/projected/563e6e0b-afe9-409c-b3e6-6bd842412c38-kube-api-access-22v4p\") on node \"crc\" DevicePath \"\""
Nov 26 22:50:23 crc kubenswrapper[5008]: I1126 22:50:23.030783 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/563e6e0b-afe9-409c-b3e6-6bd842412c38-util" (OuterVolumeSpecName: "util") pod "563e6e0b-afe9-409c-b3e6-6bd842412c38" (UID: "563e6e0b-afe9-409c-b3e6-6bd842412c38"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 22:50:23 crc kubenswrapper[5008]: I1126 22:50:23.068337 5008 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/563e6e0b-afe9-409c-b3e6-6bd842412c38-util\") on node \"crc\" DevicePath \"\""
Nov 26 22:50:23 crc kubenswrapper[5008]: I1126 22:50:23.522037 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz"
Nov 26 22:50:23 crc kubenswrapper[5008]: I1126 22:50:23.528613 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69vkxz" event={"ID":"563e6e0b-afe9-409c-b3e6-6bd842412c38","Type":"ContainerDied","Data":"6d46d566e182cc9fcb284c5e9c1f2f0d0ae6c3452a843ac3bd21a5d4b4c22796"}
Nov 26 22:50:23 crc kubenswrapper[5008]: I1126 22:50:23.528675 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d46d566e182cc9fcb284c5e9c1f2f0d0ae6c3452a843ac3bd21a5d4b4c22796"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.343287 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj"]
Nov 26 22:50:32 crc kubenswrapper[5008]: E1126 22:50:32.343840 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="563e6e0b-afe9-409c-b3e6-6bd842412c38" containerName="extract"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.343852 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="563e6e0b-afe9-409c-b3e6-6bd842412c38" containerName="extract"
Nov 26 22:50:32 crc kubenswrapper[5008]: E1126 22:50:32.343863 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="563e6e0b-afe9-409c-b3e6-6bd842412c38" containerName="util"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.343869 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="563e6e0b-afe9-409c-b3e6-6bd842412c38" containerName="util"
Nov 26 22:50:32 crc kubenswrapper[5008]: E1126 22:50:32.343888 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="563e6e0b-afe9-409c-b3e6-6bd842412c38" containerName="pull"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.343893 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="563e6e0b-afe9-409c-b3e6-6bd842412c38" containerName="pull"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.344002 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="563e6e0b-afe9-409c-b3e6-6bd842412c38" containerName="extract"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.344333 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.346460 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.346580 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.347210 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-lbp6p"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.347405 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.347564 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.359592 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj"]
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.511247 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5-webhook-cert\") pod \"metallb-operator-controller-manager-7d66f7697f-2vlzj\" (UID: \"9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5\") " pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.511310 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5-apiservice-cert\") pod \"metallb-operator-controller-manager-7d66f7697f-2vlzj\" (UID: \"9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5\") " pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.511478 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gcpl\" (UniqueName: \"kubernetes.io/projected/9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5-kube-api-access-9gcpl\") pod \"metallb-operator-controller-manager-7d66f7697f-2vlzj\" (UID: \"9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5\") " pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.613071 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5-webhook-cert\") pod \"metallb-operator-controller-manager-7d66f7697f-2vlzj\" (UID: \"9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5\") " pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.613186 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5-apiservice-cert\") pod \"metallb-operator-controller-manager-7d66f7697f-2vlzj\" (UID: \"9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5\") " pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.613231 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gcpl\" (UniqueName: \"kubernetes.io/projected/9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5-kube-api-access-9gcpl\") pod \"metallb-operator-controller-manager-7d66f7697f-2vlzj\" (UID: \"9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5\") " pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.621690 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5-apiservice-cert\") pod \"metallb-operator-controller-manager-7d66f7697f-2vlzj\" (UID: \"9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5\") " pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.628019 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5-webhook-cert\") pod \"metallb-operator-controller-manager-7d66f7697f-2vlzj\" (UID: \"9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5\") " pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.650847 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gcpl\" (UniqueName: \"kubernetes.io/projected/9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5-kube-api-access-9gcpl\") pod \"metallb-operator-controller-manager-7d66f7697f-2vlzj\" (UID: \"9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5\") " pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.658258 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.768671 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7488b4d88c-gpqj2"]
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.769317 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7488b4d88c-gpqj2"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.776361 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.776617 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.777028 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-cr7cs"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.779427 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7488b4d88c-gpqj2"]
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.916639 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/76f5fead-958c-4762-91b9-0f713c213404-apiservice-cert\") pod \"metallb-operator-webhook-server-7488b4d88c-gpqj2\" (UID: \"76f5fead-958c-4762-91b9-0f713c213404\") " pod="metallb-system/metallb-operator-webhook-server-7488b4d88c-gpqj2"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.916937 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jwt5\" (UniqueName: \"kubernetes.io/projected/76f5fead-958c-4762-91b9-0f713c213404-kube-api-access-9jwt5\") pod \"metallb-operator-webhook-server-7488b4d88c-gpqj2\" (UID: \"76f5fead-958c-4762-91b9-0f713c213404\") " pod="metallb-system/metallb-operator-webhook-server-7488b4d88c-gpqj2"
Nov 26 22:50:32 crc kubenswrapper[5008]: I1126 22:50:32.917035 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/76f5fead-958c-4762-91b9-0f713c213404-webhook-cert\") pod \"metallb-operator-webhook-server-7488b4d88c-gpqj2\" (UID: \"76f5fead-958c-4762-91b9-0f713c213404\") " pod="metallb-system/metallb-operator-webhook-server-7488b4d88c-gpqj2"
Nov 26 22:50:33 crc kubenswrapper[5008]: I1126 22:50:33.018058 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/76f5fead-958c-4762-91b9-0f713c213404-apiservice-cert\") pod \"metallb-operator-webhook-server-7488b4d88c-gpqj2\" (UID: \"76f5fead-958c-4762-91b9-0f713c213404\") " pod="metallb-system/metallb-operator-webhook-server-7488b4d88c-gpqj2"
Nov 26 22:50:33 crc kubenswrapper[5008]: I1126 22:50:33.018103 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jwt5\" (UniqueName: \"kubernetes.io/projected/76f5fead-958c-4762-91b9-0f713c213404-kube-api-access-9jwt5\") pod \"metallb-operator-webhook-server-7488b4d88c-gpqj2\" (UID: \"76f5fead-958c-4762-91b9-0f713c213404\") " pod="metallb-system/metallb-operator-webhook-server-7488b4d88c-gpqj2"
Nov 26 22:50:33 crc kubenswrapper[5008]: I1126 22:50:33.018161 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/76f5fead-958c-4762-91b9-0f713c213404-webhook-cert\") pod \"metallb-operator-webhook-server-7488b4d88c-gpqj2\" (UID: \"76f5fead-958c-4762-91b9-0f713c213404\") " pod="metallb-system/metallb-operator-webhook-server-7488b4d88c-gpqj2"
Nov 26 22:50:33 crc kubenswrapper[5008]: I1126 22:50:33.023201 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/76f5fead-958c-4762-91b9-0f713c213404-apiservice-cert\") pod \"metallb-operator-webhook-server-7488b4d88c-gpqj2\" (UID: \"76f5fead-958c-4762-91b9-0f713c213404\") " pod="metallb-system/metallb-operator-webhook-server-7488b4d88c-gpqj2"
Nov 26 22:50:33 crc kubenswrapper[5008]: I1126 22:50:33.023420 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/76f5fead-958c-4762-91b9-0f713c213404-webhook-cert\") pod \"metallb-operator-webhook-server-7488b4d88c-gpqj2\" (UID: \"76f5fead-958c-4762-91b9-0f713c213404\") " pod="metallb-system/metallb-operator-webhook-server-7488b4d88c-gpqj2"
Nov 26 22:50:33 crc kubenswrapper[5008]: I1126 22:50:33.053815 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jwt5\" (UniqueName: \"kubernetes.io/projected/76f5fead-958c-4762-91b9-0f713c213404-kube-api-access-9jwt5\") pod \"metallb-operator-webhook-server-7488b4d88c-gpqj2\" (UID: \"76f5fead-958c-4762-91b9-0f713c213404\") " pod="metallb-system/metallb-operator-webhook-server-7488b4d88c-gpqj2"
Nov 26 22:50:33 crc kubenswrapper[5008]: I1126 22:50:33.099038 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7488b4d88c-gpqj2"
Nov 26 22:50:33 crc kubenswrapper[5008]: I1126 22:50:33.133396 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj"]
Nov 26 22:50:33 crc kubenswrapper[5008]: W1126 22:50:33.138818 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b33e0c3_fdbc_41ff_8c6c_8581e4e641c5.slice/crio-8e5092b707b51df1fd533bee02ebf7aa1ef6d7ef01a016e2987a44cb46f95b05 WatchSource:0}: Error finding container 8e5092b707b51df1fd533bee02ebf7aa1ef6d7ef01a016e2987a44cb46f95b05: Status 404 returned error can't find the container with id 8e5092b707b51df1fd533bee02ebf7aa1ef6d7ef01a016e2987a44cb46f95b05
Nov 26 22:50:33 crc kubenswrapper[5008]: I1126 22:50:33.584538 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" event={"ID":"9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5","Type":"ContainerStarted","Data":"8e5092b707b51df1fd533bee02ebf7aa1ef6d7ef01a016e2987a44cb46f95b05"}
Nov 26 22:50:33 crc kubenswrapper[5008]: I1126 22:50:33.596335 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7488b4d88c-gpqj2"]
Nov 26 22:50:34 crc kubenswrapper[5008]: I1126 22:50:34.589581 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7488b4d88c-gpqj2" event={"ID":"76f5fead-958c-4762-91b9-0f713c213404","Type":"ContainerStarted","Data":"11376d1c22a6e9b5032a3fb93a1ff63d7c453a4db58d49b6516d60b52c964bda"}
Nov 26 22:50:40 crc kubenswrapper[5008]: I1126 22:50:40.625095 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" event={"ID":"9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5","Type":"ContainerStarted","Data":"52bbce5f4fecf343d959649894234c07502df7f01fc7c536f0004f31828b5d98"}
Nov 26 22:50:40 crc kubenswrapper[5008]: I1126 22:50:40.627169 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7488b4d88c-gpqj2" event={"ID":"76f5fead-958c-4762-91b9-0f713c213404","Type":"ContainerStarted","Data":"765f1e7d50998c2484b7c206a79459420b2f716724bbe2e41bf5f22d8ce9daa5"}
Nov 26 22:50:40 crc kubenswrapper[5008]: I1126 22:50:40.627408 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj"
Nov 26 22:50:40 crc kubenswrapper[5008]: I1126 22:50:40.627585 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7488b4d88c-gpqj2"
Nov 26 22:50:40 crc kubenswrapper[5008]: I1126 22:50:40.660556 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" podStartSLOduration=2.304813306 podStartE2EDuration="8.660527937s" podCreationTimestamp="2025-11-26 22:50:32 +0000 UTC" firstStartedPulling="2025-11-26 22:50:33.147120158 +0000 UTC m=+708.559814200" lastFinishedPulling="2025-11-26 22:50:39.502834839 +0000 UTC m=+714.915528831" observedRunningTime="2025-11-26 22:50:40.654936462 +0000 UTC m=+716.067630504" watchObservedRunningTime="2025-11-26 22:50:40.660527937 +0000 UTC m=+716.073221979"
Nov 26 22:50:40 crc kubenswrapper[5008]: I1126 22:50:40.677317 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7488b4d88c-gpqj2" podStartSLOduration=2.776440538 podStartE2EDuration="8.677297794s" podCreationTimestamp="2025-11-26 22:50:32 +0000 UTC" firstStartedPulling="2025-11-26 22:50:33.603595624 +0000 UTC m=+709.016289626" lastFinishedPulling="2025-11-26 22:50:39.50445288 +0000 UTC m=+714.917146882" observedRunningTime="2025-11-26 22:50:40.674141155 +0000 UTC m=+716.086835197" watchObservedRunningTime="2025-11-26 22:50:40.677297794 +0000 UTC m=+716.089991796"
Nov 26 22:50:53 crc kubenswrapper[5008]: I1126 22:50:53.110004 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7488b4d88c-gpqj2"
Nov 26 22:50:59 crc kubenswrapper[5008]: I1126 22:50:59.281303 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 22:50:59 crc kubenswrapper[5008]: I1126 22:50:59.281716 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 26 22:51:12 crc kubenswrapper[5008]: I1126 22:51:12.663409 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.339484 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-hnx8h"]
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.340480 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-hnx8h"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.342159 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-288r6"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.342950 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.344411 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-h8srm"]
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.347223 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.349780 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.351188 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.352463 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-hnx8h"]
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.409588 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a0701083-9676-4774-a96e-f6dbe0a67c7b-metrics\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") " pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.409652 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a0701083-9676-4774-a96e-f6dbe0a67c7b-frr-conf\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") " pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.409689 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a0701083-9676-4774-a96e-f6dbe0a67c7b-reloader\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") " pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.409728 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a0701083-9676-4774-a96e-f6dbe0a67c7b-frr-sockets\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") " pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.409769 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3509ccc3-93df-42aa-9d14-639703b9531e-cert\") pod \"frr-k8s-webhook-server-6998585d5-hnx8h\" (UID: \"3509ccc3-93df-42aa-9d14-639703b9531e\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-hnx8h"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.409791 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7lz2\" (UniqueName: \"kubernetes.io/projected/3509ccc3-93df-42aa-9d14-639703b9531e-kube-api-access-g7lz2\") pod \"frr-k8s-webhook-server-6998585d5-hnx8h\" (UID: \"3509ccc3-93df-42aa-9d14-639703b9531e\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-hnx8h"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.409815 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6p42\" (UniqueName: \"kubernetes.io/projected/a0701083-9676-4774-a96e-f6dbe0a67c7b-kube-api-access-t6p42\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") " pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.409845 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0701083-9676-4774-a96e-f6dbe0a67c7b-metrics-certs\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") " pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.409869 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a0701083-9676-4774-a96e-f6dbe0a67c7b-frr-startup\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") " pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.428908 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-l5slf"]
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.429723 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-l5slf"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.432646 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.432678 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-xw689"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.432720 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6c7b4b5f48-hwwh2"]
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.432770 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.433378 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.433762 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-hwwh2"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.434745 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.444984 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-hwwh2"]
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.510541 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0701083-9676-4774-a96e-f6dbe0a67c7b-metrics-certs\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") " pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.510594 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a0701083-9676-4774-a96e-f6dbe0a67c7b-frr-startup\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") " pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.510628 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lrrt\" (UniqueName: \"kubernetes.io/projected/9b1a5b0b-c494-499d-9339-83c0bc6dc105-kube-api-access-7lrrt\") pod \"speaker-l5slf\" (UID: \"9b1a5b0b-c494-499d-9339-83c0bc6dc105\") " pod="metallb-system/speaker-l5slf"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.510649 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a0701083-9676-4774-a96e-f6dbe0a67c7b-metrics\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") " pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.510671 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a0701083-9676-4774-a96e-f6dbe0a67c7b-frr-conf\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") " pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.510692 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a0701083-9676-4774-a96e-f6dbe0a67c7b-reloader\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") " pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.510740 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b1a5b0b-c494-499d-9339-83c0bc6dc105-metrics-certs\") pod \"speaker-l5slf\" (UID: \"9b1a5b0b-c494-499d-9339-83c0bc6dc105\") " pod="metallb-system/speaker-l5slf"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.510761 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9b1a5b0b-c494-499d-9339-83c0bc6dc105-metallb-excludel2\") pod \"speaker-l5slf\" (UID: \"9b1a5b0b-c494-499d-9339-83c0bc6dc105\") " pod="metallb-system/speaker-l5slf"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.510782 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkk5l\" (UniqueName: \"kubernetes.io/projected/b41ecab0-6948-40a6-a9df-b4ff781a8122-kube-api-access-dkk5l\") pod \"controller-6c7b4b5f48-hwwh2\" (UID: \"b41ecab0-6948-40a6-a9df-b4ff781a8122\") " pod="metallb-system/controller-6c7b4b5f48-hwwh2"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.510800 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9b1a5b0b-c494-499d-9339-83c0bc6dc105-memberlist\") pod \"speaker-l5slf\" (UID: \"9b1a5b0b-c494-499d-9339-83c0bc6dc105\") " pod="metallb-system/speaker-l5slf"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.510820 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b41ecab0-6948-40a6-a9df-b4ff781a8122-cert\") pod \"controller-6c7b4b5f48-hwwh2\" (UID: \"b41ecab0-6948-40a6-a9df-b4ff781a8122\") " pod="metallb-system/controller-6c7b4b5f48-hwwh2"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.510840 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b41ecab0-6948-40a6-a9df-b4ff781a8122-metrics-certs\") pod \"controller-6c7b4b5f48-hwwh2\" (UID: \"b41ecab0-6948-40a6-a9df-b4ff781a8122\") " pod="metallb-system/controller-6c7b4b5f48-hwwh2"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.510858 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a0701083-9676-4774-a96e-f6dbe0a67c7b-frr-sockets\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") " pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.510891 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3509ccc3-93df-42aa-9d14-639703b9531e-cert\") pod \"frr-k8s-webhook-server-6998585d5-hnx8h\" (UID: \"3509ccc3-93df-42aa-9d14-639703b9531e\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-hnx8h"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.510908 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7lz2\" (UniqueName: \"kubernetes.io/projected/3509ccc3-93df-42aa-9d14-639703b9531e-kube-api-access-g7lz2\") pod \"frr-k8s-webhook-server-6998585d5-hnx8h\" (UID: \"3509ccc3-93df-42aa-9d14-639703b9531e\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-hnx8h"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.510926 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6p42\" (UniqueName: \"kubernetes.io/projected/a0701083-9676-4774-a96e-f6dbe0a67c7b-kube-api-access-t6p42\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") " pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.511136 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a0701083-9676-4774-a96e-f6dbe0a67c7b-frr-conf\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") " pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.511335 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a0701083-9676-4774-a96e-f6dbe0a67c7b-metrics\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") " pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: E1126 22:51:13.511435 5008 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Nov 26 22:51:13 crc kubenswrapper[5008]: E1126 22:51:13.511483 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3509ccc3-93df-42aa-9d14-639703b9531e-cert podName:3509ccc3-93df-42aa-9d14-639703b9531e nodeName:}" failed. No retries permitted until 2025-11-26 22:51:14.011467568 +0000 UTC m=+749.424161560 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3509ccc3-93df-42aa-9d14-639703b9531e-cert") pod "frr-k8s-webhook-server-6998585d5-hnx8h" (UID: "3509ccc3-93df-42aa-9d14-639703b9531e") : secret "frr-k8s-webhook-server-cert" not found
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.511518 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a0701083-9676-4774-a96e-f6dbe0a67c7b-reloader\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") " pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.511641 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a0701083-9676-4774-a96e-f6dbe0a67c7b-frr-sockets\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") " pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.511681 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a0701083-9676-4774-a96e-f6dbe0a67c7b-frr-startup\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") " pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.518359 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0701083-9676-4774-a96e-f6dbe0a67c7b-metrics-certs\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") " pod="metallb-system/frr-k8s-h8srm"
Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.527674 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6p42\" (UniqueName: \"kubernetes.io/projected/a0701083-9676-4774-a96e-f6dbe0a67c7b-kube-api-access-t6p42\") pod \"frr-k8s-h8srm\" (UID: \"a0701083-9676-4774-a96e-f6dbe0a67c7b\") "
pod="metallb-system/frr-k8s-h8srm" Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.547094 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7lz2\" (UniqueName: \"kubernetes.io/projected/3509ccc3-93df-42aa-9d14-639703b9531e-kube-api-access-g7lz2\") pod \"frr-k8s-webhook-server-6998585d5-hnx8h\" (UID: \"3509ccc3-93df-42aa-9d14-639703b9531e\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-hnx8h" Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.611694 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lrrt\" (UniqueName: \"kubernetes.io/projected/9b1a5b0b-c494-499d-9339-83c0bc6dc105-kube-api-access-7lrrt\") pod \"speaker-l5slf\" (UID: \"9b1a5b0b-c494-499d-9339-83c0bc6dc105\") " pod="metallb-system/speaker-l5slf" Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.611777 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b1a5b0b-c494-499d-9339-83c0bc6dc105-metrics-certs\") pod \"speaker-l5slf\" (UID: \"9b1a5b0b-c494-499d-9339-83c0bc6dc105\") " pod="metallb-system/speaker-l5slf" Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.611802 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9b1a5b0b-c494-499d-9339-83c0bc6dc105-metallb-excludel2\") pod \"speaker-l5slf\" (UID: \"9b1a5b0b-c494-499d-9339-83c0bc6dc105\") " pod="metallb-system/speaker-l5slf" Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.611828 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkk5l\" (UniqueName: \"kubernetes.io/projected/b41ecab0-6948-40a6-a9df-b4ff781a8122-kube-api-access-dkk5l\") pod \"controller-6c7b4b5f48-hwwh2\" (UID: \"b41ecab0-6948-40a6-a9df-b4ff781a8122\") " pod="metallb-system/controller-6c7b4b5f48-hwwh2" Nov 26 22:51:13 crc 
kubenswrapper[5008]: I1126 22:51:13.611851 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9b1a5b0b-c494-499d-9339-83c0bc6dc105-memberlist\") pod \"speaker-l5slf\" (UID: \"9b1a5b0b-c494-499d-9339-83c0bc6dc105\") " pod="metallb-system/speaker-l5slf" Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.611870 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b41ecab0-6948-40a6-a9df-b4ff781a8122-cert\") pod \"controller-6c7b4b5f48-hwwh2\" (UID: \"b41ecab0-6948-40a6-a9df-b4ff781a8122\") " pod="metallb-system/controller-6c7b4b5f48-hwwh2" Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.611893 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b41ecab0-6948-40a6-a9df-b4ff781a8122-metrics-certs\") pod \"controller-6c7b4b5f48-hwwh2\" (UID: \"b41ecab0-6948-40a6-a9df-b4ff781a8122\") " pod="metallb-system/controller-6c7b4b5f48-hwwh2" Nov 26 22:51:13 crc kubenswrapper[5008]: E1126 22:51:13.612807 5008 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 26 22:51:13 crc kubenswrapper[5008]: E1126 22:51:13.612873 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b1a5b0b-c494-499d-9339-83c0bc6dc105-memberlist podName:9b1a5b0b-c494-499d-9339-83c0bc6dc105 nodeName:}" failed. No retries permitted until 2025-11-26 22:51:14.112851472 +0000 UTC m=+749.525545584 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/9b1a5b0b-c494-499d-9339-83c0bc6dc105-memberlist") pod "speaker-l5slf" (UID: "9b1a5b0b-c494-499d-9339-83c0bc6dc105") : secret "metallb-memberlist" not found Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.613331 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9b1a5b0b-c494-499d-9339-83c0bc6dc105-metallb-excludel2\") pod \"speaker-l5slf\" (UID: \"9b1a5b0b-c494-499d-9339-83c0bc6dc105\") " pod="metallb-system/speaker-l5slf" Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.615674 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.615900 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b1a5b0b-c494-499d-9339-83c0bc6dc105-metrics-certs\") pod \"speaker-l5slf\" (UID: \"9b1a5b0b-c494-499d-9339-83c0bc6dc105\") " pod="metallb-system/speaker-l5slf" Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.616381 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b41ecab0-6948-40a6-a9df-b4ff781a8122-metrics-certs\") pod \"controller-6c7b4b5f48-hwwh2\" (UID: \"b41ecab0-6948-40a6-a9df-b4ff781a8122\") " pod="metallb-system/controller-6c7b4b5f48-hwwh2" Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.626462 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b41ecab0-6948-40a6-a9df-b4ff781a8122-cert\") pod \"controller-6c7b4b5f48-hwwh2\" (UID: \"b41ecab0-6948-40a6-a9df-b4ff781a8122\") " pod="metallb-system/controller-6c7b4b5f48-hwwh2" Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.627179 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-7lrrt\" (UniqueName: \"kubernetes.io/projected/9b1a5b0b-c494-499d-9339-83c0bc6dc105-kube-api-access-7lrrt\") pod \"speaker-l5slf\" (UID: \"9b1a5b0b-c494-499d-9339-83c0bc6dc105\") " pod="metallb-system/speaker-l5slf" Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.629841 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkk5l\" (UniqueName: \"kubernetes.io/projected/b41ecab0-6948-40a6-a9df-b4ff781a8122-kube-api-access-dkk5l\") pod \"controller-6c7b4b5f48-hwwh2\" (UID: \"b41ecab0-6948-40a6-a9df-b4ff781a8122\") " pod="metallb-system/controller-6c7b4b5f48-hwwh2" Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.668255 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-h8srm" Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.756216 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-hwwh2" Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.835789 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h8srm" event={"ID":"a0701083-9676-4774-a96e-f6dbe0a67c7b","Type":"ContainerStarted","Data":"f9cb6b060a5a4bfc749277acc03794065baed666da1f22a9a52fc86b231e56e4"} Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.907796 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xsdh6"] Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.908068 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" podUID="607a3fbf-3586-4c55-894a-f9107fc5679d" containerName="controller-manager" containerID="cri-o://d0fe43aee1d2840e2217b9bf56dda5e3e93dd216d7e3dd79d5d5f6e4cdc9aed2" gracePeriod=30 Nov 26 22:51:13 crc kubenswrapper[5008]: I1126 22:51:13.992619 5008 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["metallb-system/controller-6c7b4b5f48-hwwh2"] Nov 26 22:51:14 crc kubenswrapper[5008]: W1126 22:51:13.999973 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb41ecab0_6948_40a6_a9df_b4ff781a8122.slice/crio-4d92388d02b22c6bd46e6751be2de88acc40812ae976a43b50ae3b189febc372 WatchSource:0}: Error finding container 4d92388d02b22c6bd46e6751be2de88acc40812ae976a43b50ae3b189febc372: Status 404 returned error can't find the container with id 4d92388d02b22c6bd46e6751be2de88acc40812ae976a43b50ae3b189febc372 Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.007401 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q"] Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.009445 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" podUID="23eb76e7-84b0-4e28-8efb-a0454bd41d1e" containerName="route-controller-manager" containerID="cri-o://8ee9dc36c53f3d5248d1da9e164bc4e27d2459ae7def474178d8e045201596cd" gracePeriod=30 Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.017373 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3509ccc3-93df-42aa-9d14-639703b9531e-cert\") pod \"frr-k8s-webhook-server-6998585d5-hnx8h\" (UID: \"3509ccc3-93df-42aa-9d14-639703b9531e\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-hnx8h" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.029136 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3509ccc3-93df-42aa-9d14-639703b9531e-cert\") pod \"frr-k8s-webhook-server-6998585d5-hnx8h\" (UID: \"3509ccc3-93df-42aa-9d14-639703b9531e\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-hnx8h" Nov 26 
22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.118729 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9b1a5b0b-c494-499d-9339-83c0bc6dc105-memberlist\") pod \"speaker-l5slf\" (UID: \"9b1a5b0b-c494-499d-9339-83c0bc6dc105\") " pod="metallb-system/speaker-l5slf" Nov 26 22:51:14 crc kubenswrapper[5008]: E1126 22:51:14.118922 5008 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 26 22:51:14 crc kubenswrapper[5008]: E1126 22:51:14.118984 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b1a5b0b-c494-499d-9339-83c0bc6dc105-memberlist podName:9b1a5b0b-c494-499d-9339-83c0bc6dc105 nodeName:}" failed. No retries permitted until 2025-11-26 22:51:15.118959122 +0000 UTC m=+750.531653124 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/9b1a5b0b-c494-499d-9339-83c0bc6dc105-memberlist") pod "speaker-l5slf" (UID: "9b1a5b0b-c494-499d-9339-83c0bc6dc105") : secret "metallb-memberlist" not found Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.252692 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.257561 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-hnx8h" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.320940 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/607a3fbf-3586-4c55-894a-f9107fc5679d-config\") pod \"607a3fbf-3586-4c55-894a-f9107fc5679d\" (UID: \"607a3fbf-3586-4c55-894a-f9107fc5679d\") " Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.321004 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/607a3fbf-3586-4c55-894a-f9107fc5679d-serving-cert\") pod \"607a3fbf-3586-4c55-894a-f9107fc5679d\" (UID: \"607a3fbf-3586-4c55-894a-f9107fc5679d\") " Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.321031 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np7b2\" (UniqueName: \"kubernetes.io/projected/607a3fbf-3586-4c55-894a-f9107fc5679d-kube-api-access-np7b2\") pod \"607a3fbf-3586-4c55-894a-f9107fc5679d\" (UID: \"607a3fbf-3586-4c55-894a-f9107fc5679d\") " Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.321089 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/607a3fbf-3586-4c55-894a-f9107fc5679d-client-ca\") pod \"607a3fbf-3586-4c55-894a-f9107fc5679d\" (UID: \"607a3fbf-3586-4c55-894a-f9107fc5679d\") " Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.321160 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/607a3fbf-3586-4c55-894a-f9107fc5679d-proxy-ca-bundles\") pod \"607a3fbf-3586-4c55-894a-f9107fc5679d\" (UID: \"607a3fbf-3586-4c55-894a-f9107fc5679d\") " Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.322054 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/607a3fbf-3586-4c55-894a-f9107fc5679d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "607a3fbf-3586-4c55-894a-f9107fc5679d" (UID: "607a3fbf-3586-4c55-894a-f9107fc5679d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.322084 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/607a3fbf-3586-4c55-894a-f9107fc5679d-client-ca" (OuterVolumeSpecName: "client-ca") pod "607a3fbf-3586-4c55-894a-f9107fc5679d" (UID: "607a3fbf-3586-4c55-894a-f9107fc5679d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.322416 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/607a3fbf-3586-4c55-894a-f9107fc5679d-config" (OuterVolumeSpecName: "config") pod "607a3fbf-3586-4c55-894a-f9107fc5679d" (UID: "607a3fbf-3586-4c55-894a-f9107fc5679d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.324804 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/607a3fbf-3586-4c55-894a-f9107fc5679d-kube-api-access-np7b2" (OuterVolumeSpecName: "kube-api-access-np7b2") pod "607a3fbf-3586-4c55-894a-f9107fc5679d" (UID: "607a3fbf-3586-4c55-894a-f9107fc5679d"). InnerVolumeSpecName "kube-api-access-np7b2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.326329 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/607a3fbf-3586-4c55-894a-f9107fc5679d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "607a3fbf-3586-4c55-894a-f9107fc5679d" (UID: "607a3fbf-3586-4c55-894a-f9107fc5679d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.344075 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.422374 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-client-ca\") pod \"23eb76e7-84b0-4e28-8efb-a0454bd41d1e\" (UID: \"23eb76e7-84b0-4e28-8efb-a0454bd41d1e\") " Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.422455 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-serving-cert\") pod \"23eb76e7-84b0-4e28-8efb-a0454bd41d1e\" (UID: \"23eb76e7-84b0-4e28-8efb-a0454bd41d1e\") " Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.422478 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-config\") pod \"23eb76e7-84b0-4e28-8efb-a0454bd41d1e\" (UID: \"23eb76e7-84b0-4e28-8efb-a0454bd41d1e\") " Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.422527 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p22sq\" (UniqueName: \"kubernetes.io/projected/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-kube-api-access-p22sq\") pod \"23eb76e7-84b0-4e28-8efb-a0454bd41d1e\" (UID: \"23eb76e7-84b0-4e28-8efb-a0454bd41d1e\") " Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.422740 5008 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/607a3fbf-3586-4c55-894a-f9107fc5679d-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.422754 
5008 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/607a3fbf-3586-4c55-894a-f9107fc5679d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.422764 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/607a3fbf-3586-4c55-894a-f9107fc5679d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.422772 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/607a3fbf-3586-4c55-894a-f9107fc5679d-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.422781 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np7b2\" (UniqueName: \"kubernetes.io/projected/607a3fbf-3586-4c55-894a-f9107fc5679d-kube-api-access-np7b2\") on node \"crc\" DevicePath \"\"" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.423158 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-client-ca" (OuterVolumeSpecName: "client-ca") pod "23eb76e7-84b0-4e28-8efb-a0454bd41d1e" (UID: "23eb76e7-84b0-4e28-8efb-a0454bd41d1e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.423834 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-config" (OuterVolumeSpecName: "config") pod "23eb76e7-84b0-4e28-8efb-a0454bd41d1e" (UID: "23eb76e7-84b0-4e28-8efb-a0454bd41d1e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.428463 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-kube-api-access-p22sq" (OuterVolumeSpecName: "kube-api-access-p22sq") pod "23eb76e7-84b0-4e28-8efb-a0454bd41d1e" (UID: "23eb76e7-84b0-4e28-8efb-a0454bd41d1e"). InnerVolumeSpecName "kube-api-access-p22sq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.430354 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "23eb76e7-84b0-4e28-8efb-a0454bd41d1e" (UID: "23eb76e7-84b0-4e28-8efb-a0454bd41d1e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.524428 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-config\") on node \"crc\" DevicePath \"\"" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.524455 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p22sq\" (UniqueName: \"kubernetes.io/projected/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-kube-api-access-p22sq\") on node \"crc\" DevicePath \"\"" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.524467 5008 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.524478 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23eb76e7-84b0-4e28-8efb-a0454bd41d1e-serving-cert\") on node \"crc\" DevicePath 
\"\"" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.555149 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-hnx8h"] Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.844280 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-hnx8h" event={"ID":"3509ccc3-93df-42aa-9d14-639703b9531e","Type":"ContainerStarted","Data":"7f3fe801684abfa289e5bb4a0f03795804d16756abf7d449380c4f3c641f68df"} Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.846627 5008 generic.go:334] "Generic (PLEG): container finished" podID="607a3fbf-3586-4c55-894a-f9107fc5679d" containerID="d0fe43aee1d2840e2217b9bf56dda5e3e93dd216d7e3dd79d5d5f6e4cdc9aed2" exitCode=0 Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.846674 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.846729 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" event={"ID":"607a3fbf-3586-4c55-894a-f9107fc5679d","Type":"ContainerDied","Data":"d0fe43aee1d2840e2217b9bf56dda5e3e93dd216d7e3dd79d5d5f6e4cdc9aed2"} Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.846772 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xsdh6" event={"ID":"607a3fbf-3586-4c55-894a-f9107fc5679d","Type":"ContainerDied","Data":"8f9e38af9bb0ca5c061f3d6533bad35a1b8ba2db7ba66ea3e4b60e4edb340e19"} Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.846796 5008 scope.go:117] "RemoveContainer" containerID="d0fe43aee1d2840e2217b9bf56dda5e3e93dd216d7e3dd79d5d5f6e4cdc9aed2" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.850142 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/controller-6c7b4b5f48-hwwh2" event={"ID":"b41ecab0-6948-40a6-a9df-b4ff781a8122","Type":"ContainerStarted","Data":"5b5fe603cf75fdb1e863976c3be0827bd56c88a4846c61bb10d25024743a6bc8"} Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.850212 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-hwwh2" event={"ID":"b41ecab0-6948-40a6-a9df-b4ff781a8122","Type":"ContainerStarted","Data":"4d92388d02b22c6bd46e6751be2de88acc40812ae976a43b50ae3b189febc372"} Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.851457 5008 generic.go:334] "Generic (PLEG): container finished" podID="23eb76e7-84b0-4e28-8efb-a0454bd41d1e" containerID="8ee9dc36c53f3d5248d1da9e164bc4e27d2459ae7def474178d8e045201596cd" exitCode=0 Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.851493 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" event={"ID":"23eb76e7-84b0-4e28-8efb-a0454bd41d1e","Type":"ContainerDied","Data":"8ee9dc36c53f3d5248d1da9e164bc4e27d2459ae7def474178d8e045201596cd"} Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.851514 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" event={"ID":"23eb76e7-84b0-4e28-8efb-a0454bd41d1e","Type":"ContainerDied","Data":"87f7ef166a4b8a65d75fc966b1daad67b903b570d9bdf25832e2b1990ba16989"} Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.851536 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.867606 5008 scope.go:117] "RemoveContainer" containerID="d0fe43aee1d2840e2217b9bf56dda5e3e93dd216d7e3dd79d5d5f6e4cdc9aed2" Nov 26 22:51:14 crc kubenswrapper[5008]: E1126 22:51:14.868055 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0fe43aee1d2840e2217b9bf56dda5e3e93dd216d7e3dd79d5d5f6e4cdc9aed2\": container with ID starting with d0fe43aee1d2840e2217b9bf56dda5e3e93dd216d7e3dd79d5d5f6e4cdc9aed2 not found: ID does not exist" containerID="d0fe43aee1d2840e2217b9bf56dda5e3e93dd216d7e3dd79d5d5f6e4cdc9aed2" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.868125 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0fe43aee1d2840e2217b9bf56dda5e3e93dd216d7e3dd79d5d5f6e4cdc9aed2"} err="failed to get container status \"d0fe43aee1d2840e2217b9bf56dda5e3e93dd216d7e3dd79d5d5f6e4cdc9aed2\": rpc error: code = NotFound desc = could not find container \"d0fe43aee1d2840e2217b9bf56dda5e3e93dd216d7e3dd79d5d5f6e4cdc9aed2\": container with ID starting with d0fe43aee1d2840e2217b9bf56dda5e3e93dd216d7e3dd79d5d5f6e4cdc9aed2 not found: ID does not exist" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.868143 5008 scope.go:117] "RemoveContainer" containerID="8ee9dc36c53f3d5248d1da9e164bc4e27d2459ae7def474178d8e045201596cd" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.877049 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xsdh6"] Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.884870 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xsdh6"] Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.890624 5008 scope.go:117] 
"RemoveContainer" containerID="8ee9dc36c53f3d5248d1da9e164bc4e27d2459ae7def474178d8e045201596cd" Nov 26 22:51:14 crc kubenswrapper[5008]: E1126 22:51:14.891054 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ee9dc36c53f3d5248d1da9e164bc4e27d2459ae7def474178d8e045201596cd\": container with ID starting with 8ee9dc36c53f3d5248d1da9e164bc4e27d2459ae7def474178d8e045201596cd not found: ID does not exist" containerID="8ee9dc36c53f3d5248d1da9e164bc4e27d2459ae7def474178d8e045201596cd" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.891082 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ee9dc36c53f3d5248d1da9e164bc4e27d2459ae7def474178d8e045201596cd"} err="failed to get container status \"8ee9dc36c53f3d5248d1da9e164bc4e27d2459ae7def474178d8e045201596cd\": rpc error: code = NotFound desc = could not find container \"8ee9dc36c53f3d5248d1da9e164bc4e27d2459ae7def474178d8e045201596cd\": container with ID starting with 8ee9dc36c53f3d5248d1da9e164bc4e27d2459ae7def474178d8e045201596cd not found: ID does not exist" Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.899346 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q"] Nov 26 22:51:14 crc kubenswrapper[5008]: I1126 22:51:14.902807 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qz6q"] Nov 26 22:51:15 crc kubenswrapper[5008]: I1126 22:51:15.133645 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9b1a5b0b-c494-499d-9339-83c0bc6dc105-memberlist\") pod \"speaker-l5slf\" (UID: \"9b1a5b0b-c494-499d-9339-83c0bc6dc105\") " pod="metallb-system/speaker-l5slf" Nov 26 22:51:15 crc kubenswrapper[5008]: I1126 22:51:15.148518 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9b1a5b0b-c494-499d-9339-83c0bc6dc105-memberlist\") pod \"speaker-l5slf\" (UID: \"9b1a5b0b-c494-499d-9339-83c0bc6dc105\") " pod="metallb-system/speaker-l5slf" Nov 26 22:51:15 crc kubenswrapper[5008]: I1126 22:51:15.246506 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-l5slf" Nov 26 22:51:15 crc kubenswrapper[5008]: W1126 22:51:15.268066 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b1a5b0b_c494_499d_9339_83c0bc6dc105.slice/crio-f39a85361a96e79451a37455d05c23d70c31f6463c70e176f9641375b67775ca WatchSource:0}: Error finding container f39a85361a96e79451a37455d05c23d70c31f6463c70e176f9641375b67775ca: Status 404 returned error can't find the container with id f39a85361a96e79451a37455d05c23d70c31f6463c70e176f9641375b67775ca Nov 26 22:51:15 crc kubenswrapper[5008]: I1126 22:51:15.527867 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23eb76e7-84b0-4e28-8efb-a0454bd41d1e" path="/var/lib/kubelet/pods/23eb76e7-84b0-4e28-8efb-a0454bd41d1e/volumes" Nov 26 22:51:15 crc kubenswrapper[5008]: I1126 22:51:15.528800 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="607a3fbf-3586-4c55-894a-f9107fc5679d" path="/var/lib/kubelet/pods/607a3fbf-3586-4c55-894a-f9107fc5679d/volumes" Nov 26 22:51:15 crc kubenswrapper[5008]: I1126 22:51:15.872419 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-l5slf" event={"ID":"9b1a5b0b-c494-499d-9339-83c0bc6dc105","Type":"ContainerStarted","Data":"afe7091a58486ae2f2e9fac645975e97b70d47e38807e1b79049d85244dcb6c5"} Nov 26 22:51:15 crc kubenswrapper[5008]: I1126 22:51:15.872462 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-l5slf" 
event={"ID":"9b1a5b0b-c494-499d-9339-83c0bc6dc105","Type":"ContainerStarted","Data":"f39a85361a96e79451a37455d05c23d70c31f6463c70e176f9641375b67775ca"} Nov 26 22:51:15 crc kubenswrapper[5008]: I1126 22:51:15.996947 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b65967df5-j6dgt"] Nov 26 22:51:15 crc kubenswrapper[5008]: E1126 22:51:15.997168 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="607a3fbf-3586-4c55-894a-f9107fc5679d" containerName="controller-manager" Nov 26 22:51:15 crc kubenswrapper[5008]: I1126 22:51:15.997182 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="607a3fbf-3586-4c55-894a-f9107fc5679d" containerName="controller-manager" Nov 26 22:51:15 crc kubenswrapper[5008]: E1126 22:51:15.997194 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23eb76e7-84b0-4e28-8efb-a0454bd41d1e" containerName="route-controller-manager" Nov 26 22:51:15 crc kubenswrapper[5008]: I1126 22:51:15.997201 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="23eb76e7-84b0-4e28-8efb-a0454bd41d1e" containerName="route-controller-manager" Nov 26 22:51:15 crc kubenswrapper[5008]: I1126 22:51:15.997285 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="607a3fbf-3586-4c55-894a-f9107fc5679d" containerName="controller-manager" Nov 26 22:51:15 crc kubenswrapper[5008]: I1126 22:51:15.997300 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="23eb76e7-84b0-4e28-8efb-a0454bd41d1e" containerName="route-controller-manager" Nov 26 22:51:15 crc kubenswrapper[5008]: I1126 22:51:15.997599 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.000426 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.000508 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.000515 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.001671 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh"] Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.002107 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.003646 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.003848 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.010634 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.015552 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh"] Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.020195 5008 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"serving-cert" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.020405 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.020545 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.020658 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.020785 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.021050 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.021111 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.024426 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b65967df5-j6dgt"] Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.047720 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/802c4a57-0a14-457c-a944-50c415f4cb5b-client-ca\") pod \"controller-manager-b65967df5-j6dgt\" (UID: \"802c4a57-0a14-457c-a944-50c415f4cb5b\") " pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.047782 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/130fa4b8-18f1-4427-9b81-4f284f2eaeb8-client-ca\") pod \"route-controller-manager-849b5945dc-klzqh\" (UID: \"130fa4b8-18f1-4427-9b81-4f284f2eaeb8\") " pod="openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.047827 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/130fa4b8-18f1-4427-9b81-4f284f2eaeb8-config\") pod \"route-controller-manager-849b5945dc-klzqh\" (UID: \"130fa4b8-18f1-4427-9b81-4f284f2eaeb8\") " pod="openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.047846 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/802c4a57-0a14-457c-a944-50c415f4cb5b-config\") pod \"controller-manager-b65967df5-j6dgt\" (UID: \"802c4a57-0a14-457c-a944-50c415f4cb5b\") " pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.047867 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/802c4a57-0a14-457c-a944-50c415f4cb5b-serving-cert\") pod \"controller-manager-b65967df5-j6dgt\" (UID: \"802c4a57-0a14-457c-a944-50c415f4cb5b\") " pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.047896 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/802c4a57-0a14-457c-a944-50c415f4cb5b-proxy-ca-bundles\") pod \"controller-manager-b65967df5-j6dgt\" (UID: \"802c4a57-0a14-457c-a944-50c415f4cb5b\") " pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" 
Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.047921 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/130fa4b8-18f1-4427-9b81-4f284f2eaeb8-serving-cert\") pod \"route-controller-manager-849b5945dc-klzqh\" (UID: \"130fa4b8-18f1-4427-9b81-4f284f2eaeb8\") " pod="openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.047939 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgtl8\" (UniqueName: \"kubernetes.io/projected/802c4a57-0a14-457c-a944-50c415f4cb5b-kube-api-access-xgtl8\") pod \"controller-manager-b65967df5-j6dgt\" (UID: \"802c4a57-0a14-457c-a944-50c415f4cb5b\") " pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.047955 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6878x\" (UniqueName: \"kubernetes.io/projected/130fa4b8-18f1-4427-9b81-4f284f2eaeb8-kube-api-access-6878x\") pod \"route-controller-manager-849b5945dc-klzqh\" (UID: \"130fa4b8-18f1-4427-9b81-4f284f2eaeb8\") " pod="openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.149255 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/130fa4b8-18f1-4427-9b81-4f284f2eaeb8-client-ca\") pod \"route-controller-manager-849b5945dc-klzqh\" (UID: \"130fa4b8-18f1-4427-9b81-4f284f2eaeb8\") " pod="openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.149310 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/130fa4b8-18f1-4427-9b81-4f284f2eaeb8-config\") pod \"route-controller-manager-849b5945dc-klzqh\" (UID: \"130fa4b8-18f1-4427-9b81-4f284f2eaeb8\") " pod="openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.149327 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/802c4a57-0a14-457c-a944-50c415f4cb5b-config\") pod \"controller-manager-b65967df5-j6dgt\" (UID: \"802c4a57-0a14-457c-a944-50c415f4cb5b\") " pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.149348 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/802c4a57-0a14-457c-a944-50c415f4cb5b-serving-cert\") pod \"controller-manager-b65967df5-j6dgt\" (UID: \"802c4a57-0a14-457c-a944-50c415f4cb5b\") " pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.149385 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/802c4a57-0a14-457c-a944-50c415f4cb5b-proxy-ca-bundles\") pod \"controller-manager-b65967df5-j6dgt\" (UID: \"802c4a57-0a14-457c-a944-50c415f4cb5b\") " pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.149410 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/130fa4b8-18f1-4427-9b81-4f284f2eaeb8-serving-cert\") pod \"route-controller-manager-849b5945dc-klzqh\" (UID: \"130fa4b8-18f1-4427-9b81-4f284f2eaeb8\") " pod="openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.149426 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgtl8\" (UniqueName: \"kubernetes.io/projected/802c4a57-0a14-457c-a944-50c415f4cb5b-kube-api-access-xgtl8\") pod \"controller-manager-b65967df5-j6dgt\" (UID: \"802c4a57-0a14-457c-a944-50c415f4cb5b\") " pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.149442 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6878x\" (UniqueName: \"kubernetes.io/projected/130fa4b8-18f1-4427-9b81-4f284f2eaeb8-kube-api-access-6878x\") pod \"route-controller-manager-849b5945dc-klzqh\" (UID: \"130fa4b8-18f1-4427-9b81-4f284f2eaeb8\") " pod="openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.149474 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/802c4a57-0a14-457c-a944-50c415f4cb5b-client-ca\") pod \"controller-manager-b65967df5-j6dgt\" (UID: \"802c4a57-0a14-457c-a944-50c415f4cb5b\") " pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.150536 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/130fa4b8-18f1-4427-9b81-4f284f2eaeb8-client-ca\") pod \"route-controller-manager-849b5945dc-klzqh\" (UID: \"130fa4b8-18f1-4427-9b81-4f284f2eaeb8\") " pod="openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.150669 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/802c4a57-0a14-457c-a944-50c415f4cb5b-client-ca\") pod \"controller-manager-b65967df5-j6dgt\" (UID: \"802c4a57-0a14-457c-a944-50c415f4cb5b\") " 
pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.150674 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/802c4a57-0a14-457c-a944-50c415f4cb5b-config\") pod \"controller-manager-b65967df5-j6dgt\" (UID: \"802c4a57-0a14-457c-a944-50c415f4cb5b\") " pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.150896 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/130fa4b8-18f1-4427-9b81-4f284f2eaeb8-config\") pod \"route-controller-manager-849b5945dc-klzqh\" (UID: \"130fa4b8-18f1-4427-9b81-4f284f2eaeb8\") " pod="openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.151524 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/802c4a57-0a14-457c-a944-50c415f4cb5b-proxy-ca-bundles\") pod \"controller-manager-b65967df5-j6dgt\" (UID: \"802c4a57-0a14-457c-a944-50c415f4cb5b\") " pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.155084 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/130fa4b8-18f1-4427-9b81-4f284f2eaeb8-serving-cert\") pod \"route-controller-manager-849b5945dc-klzqh\" (UID: \"130fa4b8-18f1-4427-9b81-4f284f2eaeb8\") " pod="openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.169615 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/802c4a57-0a14-457c-a944-50c415f4cb5b-serving-cert\") pod 
\"controller-manager-b65967df5-j6dgt\" (UID: \"802c4a57-0a14-457c-a944-50c415f4cb5b\") " pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.171744 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgtl8\" (UniqueName: \"kubernetes.io/projected/802c4a57-0a14-457c-a944-50c415f4cb5b-kube-api-access-xgtl8\") pod \"controller-manager-b65967df5-j6dgt\" (UID: \"802c4a57-0a14-457c-a944-50c415f4cb5b\") " pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.172161 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6878x\" (UniqueName: \"kubernetes.io/projected/130fa4b8-18f1-4427-9b81-4f284f2eaeb8-kube-api-access-6878x\") pod \"route-controller-manager-849b5945dc-klzqh\" (UID: \"130fa4b8-18f1-4427-9b81-4f284f2eaeb8\") " pod="openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.348590 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.348705 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.845488 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh"] Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.881801 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh" event={"ID":"130fa4b8-18f1-4427-9b81-4f284f2eaeb8","Type":"ContainerStarted","Data":"746853a050cb2fd8037d0ef75a68b4bed7f8fef8c80c534c95367b1879f46de5"} Nov 26 22:51:16 crc kubenswrapper[5008]: I1126 22:51:16.932530 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b65967df5-j6dgt"] Nov 26 22:51:16 crc kubenswrapper[5008]: W1126 22:51:16.942828 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod802c4a57_0a14_457c_a944_50c415f4cb5b.slice/crio-595c081c330cd13264aa66d3323a42d2be266265273901aafe27297f0bf009a8 WatchSource:0}: Error finding container 595c081c330cd13264aa66d3323a42d2be266265273901aafe27297f0bf009a8: Status 404 returned error can't find the container with id 595c081c330cd13264aa66d3323a42d2be266265273901aafe27297f0bf009a8 Nov 26 22:51:17 crc kubenswrapper[5008]: I1126 22:51:17.888759 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh" event={"ID":"130fa4b8-18f1-4427-9b81-4f284f2eaeb8","Type":"ContainerStarted","Data":"803366046eab80a54325368501f17c4960101f6ae0c29726a376458cd511dcc3"} Nov 26 22:51:17 crc kubenswrapper[5008]: I1126 22:51:17.889376 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh" Nov 26 22:51:17 crc kubenswrapper[5008]: I1126 
22:51:17.892412 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" event={"ID":"802c4a57-0a14-457c-a944-50c415f4cb5b","Type":"ContainerStarted","Data":"ffdff4f5c6189133119396a640496ea6ce2d70203ea5c7ac6999b358ffeb7dff"} Nov 26 22:51:17 crc kubenswrapper[5008]: I1126 22:51:17.892573 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" event={"ID":"802c4a57-0a14-457c-a944-50c415f4cb5b","Type":"ContainerStarted","Data":"595c081c330cd13264aa66d3323a42d2be266265273901aafe27297f0bf009a8"} Nov 26 22:51:17 crc kubenswrapper[5008]: I1126 22:51:17.892895 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" Nov 26 22:51:17 crc kubenswrapper[5008]: I1126 22:51:17.897187 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh" Nov 26 22:51:17 crc kubenswrapper[5008]: I1126 22:51:17.897590 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" Nov 26 22:51:17 crc kubenswrapper[5008]: I1126 22:51:17.913125 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-849b5945dc-klzqh" podStartSLOduration=3.9131063900000003 podStartE2EDuration="3.91310639s" podCreationTimestamp="2025-11-26 22:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:51:17.906484022 +0000 UTC m=+753.319178024" watchObservedRunningTime="2025-11-26 22:51:17.91310639 +0000 UTC m=+753.325800392" Nov 26 22:51:17 crc kubenswrapper[5008]: I1126 22:51:17.948826 5008 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-controller-manager/controller-manager-b65967df5-j6dgt" podStartSLOduration=3.948805242 podStartE2EDuration="3.948805242s" podCreationTimestamp="2025-11-26 22:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:51:17.941470841 +0000 UTC m=+753.354164833" watchObservedRunningTime="2025-11-26 22:51:17.948805242 +0000 UTC m=+753.361499244" Nov 26 22:51:18 crc kubenswrapper[5008]: I1126 22:51:18.906489 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-hwwh2" event={"ID":"b41ecab0-6948-40a6-a9df-b4ff781a8122","Type":"ContainerStarted","Data":"7b81b5137f1dbf6ea1101b406b57c6da03f06e03bdd94c0eceb0e277e28d3d60"} Nov 26 22:51:18 crc kubenswrapper[5008]: I1126 22:51:18.907123 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6c7b4b5f48-hwwh2" Nov 26 22:51:18 crc kubenswrapper[5008]: I1126 22:51:18.913139 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-l5slf" event={"ID":"9b1a5b0b-c494-499d-9339-83c0bc6dc105","Type":"ContainerStarted","Data":"2ac67933d6860918120626d7fa8f3f93d191bfb24fe923d03951b1973629c07d"} Nov 26 22:51:18 crc kubenswrapper[5008]: I1126 22:51:18.933531 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6c7b4b5f48-hwwh2" podStartSLOduration=1.586957792 podStartE2EDuration="5.933510975s" podCreationTimestamp="2025-11-26 22:51:13 +0000 UTC" firstStartedPulling="2025-11-26 22:51:14.182137026 +0000 UTC m=+749.594831028" lastFinishedPulling="2025-11-26 22:51:18.528690199 +0000 UTC m=+753.941384211" observedRunningTime="2025-11-26 22:51:18.930843271 +0000 UTC m=+754.343537283" watchObservedRunningTime="2025-11-26 22:51:18.933510975 +0000 UTC m=+754.346204987" Nov 26 22:51:19 crc kubenswrapper[5008]: I1126 22:51:19.932331 5008 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="metallb-system/speaker-l5slf" Nov 26 22:51:22 crc kubenswrapper[5008]: I1126 22:51:22.807657 5008 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 26 22:51:22 crc kubenswrapper[5008]: I1126 22:51:22.951879 5008 generic.go:334] "Generic (PLEG): container finished" podID="a0701083-9676-4774-a96e-f6dbe0a67c7b" containerID="11d483d79e486c30dbf081098322bbc4190226c6d5961c31150204a8da995fa0" exitCode=0 Nov 26 22:51:22 crc kubenswrapper[5008]: I1126 22:51:22.951945 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h8srm" event={"ID":"a0701083-9676-4774-a96e-f6dbe0a67c7b","Type":"ContainerDied","Data":"11d483d79e486c30dbf081098322bbc4190226c6d5961c31150204a8da995fa0"} Nov 26 22:51:22 crc kubenswrapper[5008]: I1126 22:51:22.953096 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-hnx8h" event={"ID":"3509ccc3-93df-42aa-9d14-639703b9531e","Type":"ContainerStarted","Data":"74bd2ffcdb831c8ecb7b78756060353ead95e52e015adbc5ae6f5f4b8acf7c93"} Nov 26 22:51:22 crc kubenswrapper[5008]: I1126 22:51:22.953331 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-6998585d5-hnx8h" Nov 26 22:51:22 crc kubenswrapper[5008]: I1126 22:51:22.985575 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-l5slf" podStartSLOduration=6.971762301 podStartE2EDuration="9.985557577s" podCreationTimestamp="2025-11-26 22:51:13 +0000 UTC" firstStartedPulling="2025-11-26 22:51:15.500334195 +0000 UTC m=+750.913028207" lastFinishedPulling="2025-11-26 22:51:18.514129471 +0000 UTC m=+753.926823483" observedRunningTime="2025-11-26 22:51:18.957186078 +0000 UTC m=+754.369880080" watchObservedRunningTime="2025-11-26 22:51:22.985557577 +0000 UTC m=+758.398251589" Nov 26 22:51:23 crc kubenswrapper[5008]: 
I1126 22:51:23.959817 5008 generic.go:334] "Generic (PLEG): container finished" podID="a0701083-9676-4774-a96e-f6dbe0a67c7b" containerID="ac7dcc8b2012d7f18e8083a85c0ce5010600514ac170779dfe110d7069466529" exitCode=0 Nov 26 22:51:23 crc kubenswrapper[5008]: I1126 22:51:23.959893 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h8srm" event={"ID":"a0701083-9676-4774-a96e-f6dbe0a67c7b","Type":"ContainerDied","Data":"ac7dcc8b2012d7f18e8083a85c0ce5010600514ac170779dfe110d7069466529"} Nov 26 22:51:23 crc kubenswrapper[5008]: I1126 22:51:23.986549 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-6998585d5-hnx8h" podStartSLOduration=3.058848701 podStartE2EDuration="10.98652683s" podCreationTimestamp="2025-11-26 22:51:13 +0000 UTC" firstStartedPulling="2025-11-26 22:51:14.563274379 +0000 UTC m=+749.975968371" lastFinishedPulling="2025-11-26 22:51:22.490952488 +0000 UTC m=+757.903646500" observedRunningTime="2025-11-26 22:51:22.993269589 +0000 UTC m=+758.405963601" watchObservedRunningTime="2025-11-26 22:51:23.98652683 +0000 UTC m=+759.399220832" Nov 26 22:51:24 crc kubenswrapper[5008]: I1126 22:51:24.969580 5008 generic.go:334] "Generic (PLEG): container finished" podID="a0701083-9676-4774-a96e-f6dbe0a67c7b" containerID="530c8fd766d732c7397fbda659096f607ff6b9953290f144a6b3de2ef2e8c920" exitCode=0 Nov 26 22:51:24 crc kubenswrapper[5008]: I1126 22:51:24.969704 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h8srm" event={"ID":"a0701083-9676-4774-a96e-f6dbe0a67c7b","Type":"ContainerDied","Data":"530c8fd766d732c7397fbda659096f607ff6b9953290f144a6b3de2ef2e8c920"} Nov 26 22:51:25 crc kubenswrapper[5008]: I1126 22:51:25.250766 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-l5slf" Nov 26 22:51:25 crc kubenswrapper[5008]: I1126 22:51:25.980585 5008 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="metallb-system/frr-k8s-h8srm" event={"ID":"a0701083-9676-4774-a96e-f6dbe0a67c7b","Type":"ContainerStarted","Data":"de94e893f62d2918de488981960e825bd97ddd7a3e0c1002cd8e7b4c35aa5e2a"} Nov 26 22:51:25 crc kubenswrapper[5008]: I1126 22:51:25.980832 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h8srm" event={"ID":"a0701083-9676-4774-a96e-f6dbe0a67c7b","Type":"ContainerStarted","Data":"3b5f4c5c2f89cf5dc516af1f27922370414b089a3bdb59d7092bb91592eae7f5"} Nov 26 22:51:25 crc kubenswrapper[5008]: I1126 22:51:25.980841 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h8srm" event={"ID":"a0701083-9676-4774-a96e-f6dbe0a67c7b","Type":"ContainerStarted","Data":"af19c1932d1b24aa93895b4ad56b1c6f4f2234c03ee8c0cacc6f5b236d1c0881"} Nov 26 22:51:25 crc kubenswrapper[5008]: I1126 22:51:25.980849 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h8srm" event={"ID":"a0701083-9676-4774-a96e-f6dbe0a67c7b","Type":"ContainerStarted","Data":"dda60f8a35e45f11de5ade0234ebd8a61c1cdf66958d81c3b92cc0fbac590993"} Nov 26 22:51:25 crc kubenswrapper[5008]: I1126 22:51:25.980857 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h8srm" event={"ID":"a0701083-9676-4774-a96e-f6dbe0a67c7b","Type":"ContainerStarted","Data":"88a88e6cdc6eea67187e2fb7f5b4fa79dc0a69f1af156c0dda2ba1d793d5f726"} Nov 26 22:51:26 crc kubenswrapper[5008]: I1126 22:51:26.992346 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h8srm" event={"ID":"a0701083-9676-4774-a96e-f6dbe0a67c7b","Type":"ContainerStarted","Data":"a44732b63c833c72724c6043f244a02621a50695502e275439fea85371b0c01f"} Nov 26 22:51:26 crc kubenswrapper[5008]: I1126 22:51:26.992807 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-h8srm" Nov 26 22:51:27 crc kubenswrapper[5008]: I1126 22:51:27.010875 5008 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="metallb-system/frr-k8s-h8srm" podStartSLOduration=5.30971934 podStartE2EDuration="14.010862037s" podCreationTimestamp="2025-11-26 22:51:13 +0000 UTC" firstStartedPulling="2025-11-26 22:51:13.774715078 +0000 UTC m=+749.187409090" lastFinishedPulling="2025-11-26 22:51:22.475857785 +0000 UTC m=+757.888551787" observedRunningTime="2025-11-26 22:51:27.009841815 +0000 UTC m=+762.422535857" watchObservedRunningTime="2025-11-26 22:51:27.010862037 +0000 UTC m=+762.423556039" Nov 26 22:51:28 crc kubenswrapper[5008]: I1126 22:51:28.668756 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-h8srm" Nov 26 22:51:28 crc kubenswrapper[5008]: I1126 22:51:28.711884 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-h8srm" Nov 26 22:51:29 crc kubenswrapper[5008]: I1126 22:51:29.280733 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 22:51:29 crc kubenswrapper[5008]: I1126 22:51:29.280791 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 22:51:31 crc kubenswrapper[5008]: I1126 22:51:31.500775 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-84d78"] Nov 26 22:51:31 crc kubenswrapper[5008]: I1126 22:51:31.502659 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-84d78" Nov 26 22:51:31 crc kubenswrapper[5008]: I1126 22:51:31.508566 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 26 22:51:31 crc kubenswrapper[5008]: I1126 22:51:31.509307 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-84d78"] Nov 26 22:51:31 crc kubenswrapper[5008]: I1126 22:51:31.509771 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 26 22:51:31 crc kubenswrapper[5008]: I1126 22:51:31.517258 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-hzbrd" Nov 26 22:51:31 crc kubenswrapper[5008]: I1126 22:51:31.589132 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7284p\" (UniqueName: \"kubernetes.io/projected/4e8c4fd4-3a1d-4859-84e2-a09635e076f6-kube-api-access-7284p\") pod \"mariadb-operator-index-84d78\" (UID: \"4e8c4fd4-3a1d-4859-84e2-a09635e076f6\") " pod="openstack-operators/mariadb-operator-index-84d78" Nov 26 22:51:31 crc kubenswrapper[5008]: I1126 22:51:31.690475 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7284p\" (UniqueName: \"kubernetes.io/projected/4e8c4fd4-3a1d-4859-84e2-a09635e076f6-kube-api-access-7284p\") pod \"mariadb-operator-index-84d78\" (UID: \"4e8c4fd4-3a1d-4859-84e2-a09635e076f6\") " pod="openstack-operators/mariadb-operator-index-84d78" Nov 26 22:51:31 crc kubenswrapper[5008]: I1126 22:51:31.721335 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7284p\" (UniqueName: \"kubernetes.io/projected/4e8c4fd4-3a1d-4859-84e2-a09635e076f6-kube-api-access-7284p\") pod \"mariadb-operator-index-84d78\" (UID: \"4e8c4fd4-3a1d-4859-84e2-a09635e076f6\") " 
pod="openstack-operators/mariadb-operator-index-84d78" Nov 26 22:51:31 crc kubenswrapper[5008]: I1126 22:51:31.836489 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-84d78" Nov 26 22:51:32 crc kubenswrapper[5008]: I1126 22:51:32.317222 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-84d78"] Nov 26 22:51:32 crc kubenswrapper[5008]: W1126 22:51:32.323586 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e8c4fd4_3a1d_4859_84e2_a09635e076f6.slice/crio-d7f87b64fad5e82a6958c163d01ca2ada4844ffacd727e39f20a6cb6ee5e7a40 WatchSource:0}: Error finding container d7f87b64fad5e82a6958c163d01ca2ada4844ffacd727e39f20a6cb6ee5e7a40: Status 404 returned error can't find the container with id d7f87b64fad5e82a6958c163d01ca2ada4844ffacd727e39f20a6cb6ee5e7a40 Nov 26 22:51:33 crc kubenswrapper[5008]: I1126 22:51:33.037929 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-84d78" event={"ID":"4e8c4fd4-3a1d-4859-84e2-a09635e076f6","Type":"ContainerStarted","Data":"d7f87b64fad5e82a6958c163d01ca2ada4844ffacd727e39f20a6cb6ee5e7a40"} Nov 26 22:51:33 crc kubenswrapper[5008]: I1126 22:51:33.765227 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6c7b4b5f48-hwwh2" Nov 26 22:51:34 crc kubenswrapper[5008]: I1126 22:51:34.049050 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-84d78" event={"ID":"4e8c4fd4-3a1d-4859-84e2-a09635e076f6","Type":"ContainerStarted","Data":"3bdf144d57b40232a269cf3e3c294d0211ce4dd45b82900a2e1e361d21c4205c"} Nov 26 22:51:34 crc kubenswrapper[5008]: I1126 22:51:34.066636 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-84d78" 
podStartSLOduration=2.010462418 podStartE2EDuration="3.066616546s" podCreationTimestamp="2025-11-26 22:51:31 +0000 UTC" firstStartedPulling="2025-11-26 22:51:32.326292076 +0000 UTC m=+767.738986108" lastFinishedPulling="2025-11-26 22:51:33.382446194 +0000 UTC m=+768.795140236" observedRunningTime="2025-11-26 22:51:34.064843851 +0000 UTC m=+769.477537853" watchObservedRunningTime="2025-11-26 22:51:34.066616546 +0000 UTC m=+769.479310558" Nov 26 22:51:34 crc kubenswrapper[5008]: I1126 22:51:34.270124 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-6998585d5-hnx8h" Nov 26 22:51:34 crc kubenswrapper[5008]: I1126 22:51:34.839283 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-84d78"] Nov 26 22:51:35 crc kubenswrapper[5008]: I1126 22:51:35.456838 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-gnv67"] Nov 26 22:51:35 crc kubenswrapper[5008]: I1126 22:51:35.457957 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-gnv67" Nov 26 22:51:35 crc kubenswrapper[5008]: I1126 22:51:35.471044 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-gnv67"] Nov 26 22:51:35 crc kubenswrapper[5008]: I1126 22:51:35.551537 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn4x8\" (UniqueName: \"kubernetes.io/projected/3e94d1e2-a280-4ec5-9430-4166cc88e243-kube-api-access-pn4x8\") pod \"mariadb-operator-index-gnv67\" (UID: \"3e94d1e2-a280-4ec5-9430-4166cc88e243\") " pod="openstack-operators/mariadb-operator-index-gnv67" Nov 26 22:51:35 crc kubenswrapper[5008]: I1126 22:51:35.653572 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn4x8\" (UniqueName: \"kubernetes.io/projected/3e94d1e2-a280-4ec5-9430-4166cc88e243-kube-api-access-pn4x8\") pod \"mariadb-operator-index-gnv67\" (UID: \"3e94d1e2-a280-4ec5-9430-4166cc88e243\") " pod="openstack-operators/mariadb-operator-index-gnv67" Nov 26 22:51:35 crc kubenswrapper[5008]: I1126 22:51:35.675183 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn4x8\" (UniqueName: \"kubernetes.io/projected/3e94d1e2-a280-4ec5-9430-4166cc88e243-kube-api-access-pn4x8\") pod \"mariadb-operator-index-gnv67\" (UID: \"3e94d1e2-a280-4ec5-9430-4166cc88e243\") " pod="openstack-operators/mariadb-operator-index-gnv67" Nov 26 22:51:35 crc kubenswrapper[5008]: I1126 22:51:35.790446 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-gnv67" Nov 26 22:51:36 crc kubenswrapper[5008]: I1126 22:51:36.065866 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-84d78" podUID="4e8c4fd4-3a1d-4859-84e2-a09635e076f6" containerName="registry-server" containerID="cri-o://3bdf144d57b40232a269cf3e3c294d0211ce4dd45b82900a2e1e361d21c4205c" gracePeriod=2 Nov 26 22:51:36 crc kubenswrapper[5008]: I1126 22:51:36.297166 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-gnv67"] Nov 26 22:51:36 crc kubenswrapper[5008]: W1126 22:51:36.311002 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e94d1e2_a280_4ec5_9430_4166cc88e243.slice/crio-76ea66d685c1653c20e04bae337d80565c39e328ef4118b1d418bee3e88b8b2e WatchSource:0}: Error finding container 76ea66d685c1653c20e04bae337d80565c39e328ef4118b1d418bee3e88b8b2e: Status 404 returned error can't find the container with id 76ea66d685c1653c20e04bae337d80565c39e328ef4118b1d418bee3e88b8b2e Nov 26 22:51:36 crc kubenswrapper[5008]: I1126 22:51:36.504088 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-84d78" Nov 26 22:51:36 crc kubenswrapper[5008]: I1126 22:51:36.567878 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7284p\" (UniqueName: \"kubernetes.io/projected/4e8c4fd4-3a1d-4859-84e2-a09635e076f6-kube-api-access-7284p\") pod \"4e8c4fd4-3a1d-4859-84e2-a09635e076f6\" (UID: \"4e8c4fd4-3a1d-4859-84e2-a09635e076f6\") " Nov 26 22:51:36 crc kubenswrapper[5008]: I1126 22:51:36.576313 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e8c4fd4-3a1d-4859-84e2-a09635e076f6-kube-api-access-7284p" (OuterVolumeSpecName: "kube-api-access-7284p") pod "4e8c4fd4-3a1d-4859-84e2-a09635e076f6" (UID: "4e8c4fd4-3a1d-4859-84e2-a09635e076f6"). InnerVolumeSpecName "kube-api-access-7284p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:51:36 crc kubenswrapper[5008]: I1126 22:51:36.669735 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7284p\" (UniqueName: \"kubernetes.io/projected/4e8c4fd4-3a1d-4859-84e2-a09635e076f6-kube-api-access-7284p\") on node \"crc\" DevicePath \"\"" Nov 26 22:51:37 crc kubenswrapper[5008]: I1126 22:51:37.075492 5008 generic.go:334] "Generic (PLEG): container finished" podID="4e8c4fd4-3a1d-4859-84e2-a09635e076f6" containerID="3bdf144d57b40232a269cf3e3c294d0211ce4dd45b82900a2e1e361d21c4205c" exitCode=0 Nov 26 22:51:37 crc kubenswrapper[5008]: I1126 22:51:37.075540 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-84d78" Nov 26 22:51:37 crc kubenswrapper[5008]: I1126 22:51:37.075591 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-84d78" event={"ID":"4e8c4fd4-3a1d-4859-84e2-a09635e076f6","Type":"ContainerDied","Data":"3bdf144d57b40232a269cf3e3c294d0211ce4dd45b82900a2e1e361d21c4205c"} Nov 26 22:51:37 crc kubenswrapper[5008]: I1126 22:51:37.076049 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-84d78" event={"ID":"4e8c4fd4-3a1d-4859-84e2-a09635e076f6","Type":"ContainerDied","Data":"d7f87b64fad5e82a6958c163d01ca2ada4844ffacd727e39f20a6cb6ee5e7a40"} Nov 26 22:51:37 crc kubenswrapper[5008]: I1126 22:51:37.076096 5008 scope.go:117] "RemoveContainer" containerID="3bdf144d57b40232a269cf3e3c294d0211ce4dd45b82900a2e1e361d21c4205c" Nov 26 22:51:37 crc kubenswrapper[5008]: I1126 22:51:37.081701 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-gnv67" event={"ID":"3e94d1e2-a280-4ec5-9430-4166cc88e243","Type":"ContainerStarted","Data":"92ce7115ff0ed86d23cdb25863b61ea5fc4482851b0c4aff7a09e71727fdebc7"} Nov 26 22:51:37 crc kubenswrapper[5008]: I1126 22:51:37.081748 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-gnv67" event={"ID":"3e94d1e2-a280-4ec5-9430-4166cc88e243","Type":"ContainerStarted","Data":"76ea66d685c1653c20e04bae337d80565c39e328ef4118b1d418bee3e88b8b2e"} Nov 26 22:51:37 crc kubenswrapper[5008]: I1126 22:51:37.107910 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-gnv67" podStartSLOduration=1.661043828 podStartE2EDuration="2.107888175s" podCreationTimestamp="2025-11-26 22:51:35 +0000 UTC" firstStartedPulling="2025-11-26 22:51:36.319153038 +0000 UTC m=+771.731847050" lastFinishedPulling="2025-11-26 22:51:36.765997395 +0000 
UTC m=+772.178691397" observedRunningTime="2025-11-26 22:51:37.107637897 +0000 UTC m=+772.520331949" watchObservedRunningTime="2025-11-26 22:51:37.107888175 +0000 UTC m=+772.520582217" Nov 26 22:51:37 crc kubenswrapper[5008]: I1126 22:51:37.112214 5008 scope.go:117] "RemoveContainer" containerID="3bdf144d57b40232a269cf3e3c294d0211ce4dd45b82900a2e1e361d21c4205c" Nov 26 22:51:37 crc kubenswrapper[5008]: E1126 22:51:37.113001 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bdf144d57b40232a269cf3e3c294d0211ce4dd45b82900a2e1e361d21c4205c\": container with ID starting with 3bdf144d57b40232a269cf3e3c294d0211ce4dd45b82900a2e1e361d21c4205c not found: ID does not exist" containerID="3bdf144d57b40232a269cf3e3c294d0211ce4dd45b82900a2e1e361d21c4205c" Nov 26 22:51:37 crc kubenswrapper[5008]: I1126 22:51:37.113102 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bdf144d57b40232a269cf3e3c294d0211ce4dd45b82900a2e1e361d21c4205c"} err="failed to get container status \"3bdf144d57b40232a269cf3e3c294d0211ce4dd45b82900a2e1e361d21c4205c\": rpc error: code = NotFound desc = could not find container \"3bdf144d57b40232a269cf3e3c294d0211ce4dd45b82900a2e1e361d21c4205c\": container with ID starting with 3bdf144d57b40232a269cf3e3c294d0211ce4dd45b82900a2e1e361d21c4205c not found: ID does not exist" Nov 26 22:51:37 crc kubenswrapper[5008]: I1126 22:51:37.125645 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-84d78"] Nov 26 22:51:37 crc kubenswrapper[5008]: I1126 22:51:37.129791 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-84d78"] Nov 26 22:51:37 crc kubenswrapper[5008]: I1126 22:51:37.527142 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e8c4fd4-3a1d-4859-84e2-a09635e076f6" 
path="/var/lib/kubelet/pods/4e8c4fd4-3a1d-4859-84e2-a09635e076f6/volumes" Nov 26 22:51:43 crc kubenswrapper[5008]: I1126 22:51:43.674499 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-h8srm" Nov 26 22:51:45 crc kubenswrapper[5008]: I1126 22:51:45.791408 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-gnv67" Nov 26 22:51:45 crc kubenswrapper[5008]: I1126 22:51:45.791674 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-gnv67" Nov 26 22:51:45 crc kubenswrapper[5008]: I1126 22:51:45.840768 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-gnv67" Nov 26 22:51:46 crc kubenswrapper[5008]: I1126 22:51:46.198117 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-gnv67" Nov 26 22:51:46 crc kubenswrapper[5008]: I1126 22:51:46.892456 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw"] Nov 26 22:51:46 crc kubenswrapper[5008]: E1126 22:51:46.892950 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e8c4fd4-3a1d-4859-84e2-a09635e076f6" containerName="registry-server" Nov 26 22:51:46 crc kubenswrapper[5008]: I1126 22:51:46.892983 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e8c4fd4-3a1d-4859-84e2-a09635e076f6" containerName="registry-server" Nov 26 22:51:46 crc kubenswrapper[5008]: I1126 22:51:46.893152 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e8c4fd4-3a1d-4859-84e2-a09635e076f6" containerName="registry-server" Nov 26 22:51:46 crc kubenswrapper[5008]: I1126 22:51:46.896509 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw" Nov 26 22:51:46 crc kubenswrapper[5008]: I1126 22:51:46.899381 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6kdt6" Nov 26 22:51:46 crc kubenswrapper[5008]: I1126 22:51:46.903882 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw"] Nov 26 22:51:47 crc kubenswrapper[5008]: I1126 22:51:47.038635 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht5b6\" (UniqueName: \"kubernetes.io/projected/79282a86-0bbe-46da-8269-19ee801ab580-kube-api-access-ht5b6\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw\" (UID: \"79282a86-0bbe-46da-8269-19ee801ab580\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw" Nov 26 22:51:47 crc kubenswrapper[5008]: I1126 22:51:47.038729 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/79282a86-0bbe-46da-8269-19ee801ab580-util\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw\" (UID: \"79282a86-0bbe-46da-8269-19ee801ab580\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw" Nov 26 22:51:47 crc kubenswrapper[5008]: I1126 22:51:47.038819 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/79282a86-0bbe-46da-8269-19ee801ab580-bundle\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw\" (UID: \"79282a86-0bbe-46da-8269-19ee801ab580\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw" Nov 26 22:51:47 crc kubenswrapper[5008]: I1126 
22:51:47.140386 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht5b6\" (UniqueName: \"kubernetes.io/projected/79282a86-0bbe-46da-8269-19ee801ab580-kube-api-access-ht5b6\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw\" (UID: \"79282a86-0bbe-46da-8269-19ee801ab580\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw" Nov 26 22:51:47 crc kubenswrapper[5008]: I1126 22:51:47.140473 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/79282a86-0bbe-46da-8269-19ee801ab580-util\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw\" (UID: \"79282a86-0bbe-46da-8269-19ee801ab580\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw" Nov 26 22:51:47 crc kubenswrapper[5008]: I1126 22:51:47.140564 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/79282a86-0bbe-46da-8269-19ee801ab580-bundle\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw\" (UID: \"79282a86-0bbe-46da-8269-19ee801ab580\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw" Nov 26 22:51:47 crc kubenswrapper[5008]: I1126 22:51:47.141287 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/79282a86-0bbe-46da-8269-19ee801ab580-bundle\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw\" (UID: \"79282a86-0bbe-46da-8269-19ee801ab580\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw" Nov 26 22:51:47 crc kubenswrapper[5008]: I1126 22:51:47.141505 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/79282a86-0bbe-46da-8269-19ee801ab580-util\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw\" (UID: \"79282a86-0bbe-46da-8269-19ee801ab580\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw" Nov 26 22:51:47 crc kubenswrapper[5008]: I1126 22:51:47.181806 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht5b6\" (UniqueName: \"kubernetes.io/projected/79282a86-0bbe-46da-8269-19ee801ab580-kube-api-access-ht5b6\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw\" (UID: \"79282a86-0bbe-46da-8269-19ee801ab580\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw" Nov 26 22:51:47 crc kubenswrapper[5008]: I1126 22:51:47.212592 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw" Nov 26 22:51:47 crc kubenswrapper[5008]: I1126 22:51:47.745917 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw"] Nov 26 22:51:47 crc kubenswrapper[5008]: W1126 22:51:47.752357 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79282a86_0bbe_46da_8269_19ee801ab580.slice/crio-975eea16432330a2774c96d4cf3c8edbb42db3defa837957eded780c3ece0f3a WatchSource:0}: Error finding container 975eea16432330a2774c96d4cf3c8edbb42db3defa837957eded780c3ece0f3a: Status 404 returned error can't find the container with id 975eea16432330a2774c96d4cf3c8edbb42db3defa837957eded780c3ece0f3a Nov 26 22:51:48 crc kubenswrapper[5008]: I1126 22:51:48.167409 5008 generic.go:334] "Generic (PLEG): container finished" podID="79282a86-0bbe-46da-8269-19ee801ab580" containerID="9b56d7b7f095213d79847e854306d12520dfd201bc8864a527a8674cd1e0469b" exitCode=0 Nov 26 
22:51:48 crc kubenswrapper[5008]: I1126 22:51:48.167479 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw" event={"ID":"79282a86-0bbe-46da-8269-19ee801ab580","Type":"ContainerDied","Data":"9b56d7b7f095213d79847e854306d12520dfd201bc8864a527a8674cd1e0469b"} Nov 26 22:51:48 crc kubenswrapper[5008]: I1126 22:51:48.167519 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw" event={"ID":"79282a86-0bbe-46da-8269-19ee801ab580","Type":"ContainerStarted","Data":"975eea16432330a2774c96d4cf3c8edbb42db3defa837957eded780c3ece0f3a"} Nov 26 22:51:49 crc kubenswrapper[5008]: I1126 22:51:49.177258 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw" event={"ID":"79282a86-0bbe-46da-8269-19ee801ab580","Type":"ContainerStarted","Data":"1dd77856d2e56eb125453b4586a8a70137d6bb1ec365af5ae4f7e15feab306aa"} Nov 26 22:51:50 crc kubenswrapper[5008]: I1126 22:51:50.188828 5008 generic.go:334] "Generic (PLEG): container finished" podID="79282a86-0bbe-46da-8269-19ee801ab580" containerID="1dd77856d2e56eb125453b4586a8a70137d6bb1ec365af5ae4f7e15feab306aa" exitCode=0 Nov 26 22:51:50 crc kubenswrapper[5008]: I1126 22:51:50.189016 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw" event={"ID":"79282a86-0bbe-46da-8269-19ee801ab580","Type":"ContainerDied","Data":"1dd77856d2e56eb125453b4586a8a70137d6bb1ec365af5ae4f7e15feab306aa"} Nov 26 22:51:51 crc kubenswrapper[5008]: I1126 22:51:51.201011 5008 generic.go:334] "Generic (PLEG): container finished" podID="79282a86-0bbe-46da-8269-19ee801ab580" containerID="c8e0a3a7d1081d1e1115ea5e0ca02edb208945422f6063d7afc9883180536ffb" exitCode=0 Nov 26 22:51:51 crc kubenswrapper[5008]: I1126 
22:51:51.201072 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw" event={"ID":"79282a86-0bbe-46da-8269-19ee801ab580","Type":"ContainerDied","Data":"c8e0a3a7d1081d1e1115ea5e0ca02edb208945422f6063d7afc9883180536ffb"} Nov 26 22:51:52 crc kubenswrapper[5008]: I1126 22:51:52.634535 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw" Nov 26 22:51:52 crc kubenswrapper[5008]: I1126 22:51:52.828864 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/79282a86-0bbe-46da-8269-19ee801ab580-bundle\") pod \"79282a86-0bbe-46da-8269-19ee801ab580\" (UID: \"79282a86-0bbe-46da-8269-19ee801ab580\") " Nov 26 22:51:52 crc kubenswrapper[5008]: I1126 22:51:52.828948 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/79282a86-0bbe-46da-8269-19ee801ab580-util\") pod \"79282a86-0bbe-46da-8269-19ee801ab580\" (UID: \"79282a86-0bbe-46da-8269-19ee801ab580\") " Nov 26 22:51:52 crc kubenswrapper[5008]: I1126 22:51:52.829165 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht5b6\" (UniqueName: \"kubernetes.io/projected/79282a86-0bbe-46da-8269-19ee801ab580-kube-api-access-ht5b6\") pod \"79282a86-0bbe-46da-8269-19ee801ab580\" (UID: \"79282a86-0bbe-46da-8269-19ee801ab580\") " Nov 26 22:51:52 crc kubenswrapper[5008]: I1126 22:51:52.830267 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79282a86-0bbe-46da-8269-19ee801ab580-bundle" (OuterVolumeSpecName: "bundle") pod "79282a86-0bbe-46da-8269-19ee801ab580" (UID: "79282a86-0bbe-46da-8269-19ee801ab580"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:51:52 crc kubenswrapper[5008]: I1126 22:51:52.837768 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79282a86-0bbe-46da-8269-19ee801ab580-kube-api-access-ht5b6" (OuterVolumeSpecName: "kube-api-access-ht5b6") pod "79282a86-0bbe-46da-8269-19ee801ab580" (UID: "79282a86-0bbe-46da-8269-19ee801ab580"). InnerVolumeSpecName "kube-api-access-ht5b6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:51:52 crc kubenswrapper[5008]: I1126 22:51:52.863300 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79282a86-0bbe-46da-8269-19ee801ab580-util" (OuterVolumeSpecName: "util") pod "79282a86-0bbe-46da-8269-19ee801ab580" (UID: "79282a86-0bbe-46da-8269-19ee801ab580"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:51:52 crc kubenswrapper[5008]: I1126 22:51:52.930462 5008 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/79282a86-0bbe-46da-8269-19ee801ab580-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 22:51:52 crc kubenswrapper[5008]: I1126 22:51:52.930501 5008 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/79282a86-0bbe-46da-8269-19ee801ab580-util\") on node \"crc\" DevicePath \"\"" Nov 26 22:51:52 crc kubenswrapper[5008]: I1126 22:51:52.930514 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht5b6\" (UniqueName: \"kubernetes.io/projected/79282a86-0bbe-46da-8269-19ee801ab580-kube-api-access-ht5b6\") on node \"crc\" DevicePath \"\"" Nov 26 22:51:53 crc kubenswrapper[5008]: I1126 22:51:53.219819 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw" 
event={"ID":"79282a86-0bbe-46da-8269-19ee801ab580","Type":"ContainerDied","Data":"975eea16432330a2774c96d4cf3c8edbb42db3defa837957eded780c3ece0f3a"} Nov 26 22:51:53 crc kubenswrapper[5008]: I1126 22:51:53.219892 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="975eea16432330a2774c96d4cf3c8edbb42db3defa837957eded780c3ece0f3a" Nov 26 22:51:53 crc kubenswrapper[5008]: I1126 22:51:53.219919 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fmnjbw" Nov 26 22:51:59 crc kubenswrapper[5008]: I1126 22:51:59.280857 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 22:51:59 crc kubenswrapper[5008]: I1126 22:51:59.281324 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 22:51:59 crc kubenswrapper[5008]: I1126 22:51:59.281369 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 26 22:51:59 crc kubenswrapper[5008]: I1126 22:51:59.281917 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"52afabcce87945b7bdb95fab76560c9840f1056b4d5ad19b5a36a6c1c1f5fb46"} pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 
22:51:59 crc kubenswrapper[5008]: I1126 22:51:59.281992 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" containerID="cri-o://52afabcce87945b7bdb95fab76560c9840f1056b4d5ad19b5a36a6c1c1f5fb46" gracePeriod=600 Nov 26 22:52:00 crc kubenswrapper[5008]: I1126 22:52:00.274545 5008 generic.go:334] "Generic (PLEG): container finished" podID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerID="52afabcce87945b7bdb95fab76560c9840f1056b4d5ad19b5a36a6c1c1f5fb46" exitCode=0 Nov 26 22:52:00 crc kubenswrapper[5008]: I1126 22:52:00.274624 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" event={"ID":"8e558d58-c5ad-41f5-930f-36ac26b1a1ea","Type":"ContainerDied","Data":"52afabcce87945b7bdb95fab76560c9840f1056b4d5ad19b5a36a6c1c1f5fb46"} Nov 26 22:52:00 crc kubenswrapper[5008]: I1126 22:52:00.275048 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" event={"ID":"8e558d58-c5ad-41f5-930f-36ac26b1a1ea","Type":"ContainerStarted","Data":"7996ba3586721dd9a32b1864cf2a2c0cd8e592ad6992d5bba7e5d7e532bd504c"} Nov 26 22:52:00 crc kubenswrapper[5008]: I1126 22:52:00.275085 5008 scope.go:117] "RemoveContainer" containerID="983d90254c72d506a65b1c3d88199e1f5ac7b864864378d8067f3493c3b9ada7" Nov 26 22:52:01 crc kubenswrapper[5008]: I1126 22:52:01.253749 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm"] Nov 26 22:52:01 crc kubenswrapper[5008]: E1126 22:52:01.254590 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79282a86-0bbe-46da-8269-19ee801ab580" containerName="pull" Nov 26 22:52:01 crc kubenswrapper[5008]: I1126 22:52:01.254623 5008 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="79282a86-0bbe-46da-8269-19ee801ab580" containerName="pull" Nov 26 22:52:01 crc kubenswrapper[5008]: E1126 22:52:01.254655 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79282a86-0bbe-46da-8269-19ee801ab580" containerName="extract" Nov 26 22:52:01 crc kubenswrapper[5008]: I1126 22:52:01.254671 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="79282a86-0bbe-46da-8269-19ee801ab580" containerName="extract" Nov 26 22:52:01 crc kubenswrapper[5008]: E1126 22:52:01.254717 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79282a86-0bbe-46da-8269-19ee801ab580" containerName="util" Nov 26 22:52:01 crc kubenswrapper[5008]: I1126 22:52:01.254733 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="79282a86-0bbe-46da-8269-19ee801ab580" containerName="util" Nov 26 22:52:01 crc kubenswrapper[5008]: I1126 22:52:01.254948 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="79282a86-0bbe-46da-8269-19ee801ab580" containerName="extract" Nov 26 22:52:01 crc kubenswrapper[5008]: I1126 22:52:01.255750 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" Nov 26 22:52:01 crc kubenswrapper[5008]: I1126 22:52:01.259433 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Nov 26 22:52:01 crc kubenswrapper[5008]: I1126 22:52:01.259873 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-x2qpv" Nov 26 22:52:01 crc kubenswrapper[5008]: I1126 22:52:01.260094 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 26 22:52:01 crc kubenswrapper[5008]: I1126 22:52:01.275945 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm"] Nov 26 22:52:01 crc kubenswrapper[5008]: I1126 22:52:01.453659 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a593559a-2caa-41b9-86bd-5f290b91f6ae-apiservice-cert\") pod \"mariadb-operator-controller-manager-75b97bfb54-shcmm\" (UID: \"a593559a-2caa-41b9-86bd-5f290b91f6ae\") " pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" Nov 26 22:52:01 crc kubenswrapper[5008]: I1126 22:52:01.453702 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a593559a-2caa-41b9-86bd-5f290b91f6ae-webhook-cert\") pod \"mariadb-operator-controller-manager-75b97bfb54-shcmm\" (UID: \"a593559a-2caa-41b9-86bd-5f290b91f6ae\") " pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" Nov 26 22:52:01 crc kubenswrapper[5008]: I1126 22:52:01.453732 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhn9s\" 
(UniqueName: \"kubernetes.io/projected/a593559a-2caa-41b9-86bd-5f290b91f6ae-kube-api-access-fhn9s\") pod \"mariadb-operator-controller-manager-75b97bfb54-shcmm\" (UID: \"a593559a-2caa-41b9-86bd-5f290b91f6ae\") " pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" Nov 26 22:52:01 crc kubenswrapper[5008]: I1126 22:52:01.555957 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a593559a-2caa-41b9-86bd-5f290b91f6ae-apiservice-cert\") pod \"mariadb-operator-controller-manager-75b97bfb54-shcmm\" (UID: \"a593559a-2caa-41b9-86bd-5f290b91f6ae\") " pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" Nov 26 22:52:01 crc kubenswrapper[5008]: I1126 22:52:01.556095 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a593559a-2caa-41b9-86bd-5f290b91f6ae-webhook-cert\") pod \"mariadb-operator-controller-manager-75b97bfb54-shcmm\" (UID: \"a593559a-2caa-41b9-86bd-5f290b91f6ae\") " pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" Nov 26 22:52:01 crc kubenswrapper[5008]: I1126 22:52:01.556163 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhn9s\" (UniqueName: \"kubernetes.io/projected/a593559a-2caa-41b9-86bd-5f290b91f6ae-kube-api-access-fhn9s\") pod \"mariadb-operator-controller-manager-75b97bfb54-shcmm\" (UID: \"a593559a-2caa-41b9-86bd-5f290b91f6ae\") " pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" Nov 26 22:52:01 crc kubenswrapper[5008]: I1126 22:52:01.565614 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a593559a-2caa-41b9-86bd-5f290b91f6ae-webhook-cert\") pod \"mariadb-operator-controller-manager-75b97bfb54-shcmm\" (UID: \"a593559a-2caa-41b9-86bd-5f290b91f6ae\") " 
pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" Nov 26 22:52:01 crc kubenswrapper[5008]: I1126 22:52:01.573980 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhn9s\" (UniqueName: \"kubernetes.io/projected/a593559a-2caa-41b9-86bd-5f290b91f6ae-kube-api-access-fhn9s\") pod \"mariadb-operator-controller-manager-75b97bfb54-shcmm\" (UID: \"a593559a-2caa-41b9-86bd-5f290b91f6ae\") " pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" Nov 26 22:52:01 crc kubenswrapper[5008]: I1126 22:52:01.577429 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a593559a-2caa-41b9-86bd-5f290b91f6ae-apiservice-cert\") pod \"mariadb-operator-controller-manager-75b97bfb54-shcmm\" (UID: \"a593559a-2caa-41b9-86bd-5f290b91f6ae\") " pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" Nov 26 22:52:01 crc kubenswrapper[5008]: I1126 22:52:01.622713 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" Nov 26 22:52:02 crc kubenswrapper[5008]: I1126 22:52:02.106532 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm"] Nov 26 22:52:02 crc kubenswrapper[5008]: W1126 22:52:02.120674 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda593559a_2caa_41b9_86bd_5f290b91f6ae.slice/crio-9697710db24b5068d68a51cb1c122040134a49a24bb6021ce31dafc566a7b467 WatchSource:0}: Error finding container 9697710db24b5068d68a51cb1c122040134a49a24bb6021ce31dafc566a7b467: Status 404 returned error can't find the container with id 9697710db24b5068d68a51cb1c122040134a49a24bb6021ce31dafc566a7b467 Nov 26 22:52:02 crc kubenswrapper[5008]: I1126 22:52:02.293923 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" event={"ID":"a593559a-2caa-41b9-86bd-5f290b91f6ae","Type":"ContainerStarted","Data":"9697710db24b5068d68a51cb1c122040134a49a24bb6021ce31dafc566a7b467"} Nov 26 22:52:06 crc kubenswrapper[5008]: I1126 22:52:06.325982 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" event={"ID":"a593559a-2caa-41b9-86bd-5f290b91f6ae","Type":"ContainerStarted","Data":"a28909963a1540ccbf3b5c829e0936657a00b7d479af521835f778af184fc017"} Nov 26 22:52:06 crc kubenswrapper[5008]: I1126 22:52:06.326593 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" Nov 26 22:52:06 crc kubenswrapper[5008]: I1126 22:52:06.354770 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" 
podStartSLOduration=1.9422232400000001 podStartE2EDuration="5.354747666s" podCreationTimestamp="2025-11-26 22:52:01 +0000 UTC" firstStartedPulling="2025-11-26 22:52:02.124852374 +0000 UTC m=+797.537546386" lastFinishedPulling="2025-11-26 22:52:05.53737681 +0000 UTC m=+800.950070812" observedRunningTime="2025-11-26 22:52:06.349060727 +0000 UTC m=+801.761754749" watchObservedRunningTime="2025-11-26 22:52:06.354747666 +0000 UTC m=+801.767441678" Nov 26 22:52:11 crc kubenswrapper[5008]: I1126 22:52:11.630624 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" Nov 26 22:52:14 crc kubenswrapper[5008]: I1126 22:52:14.928539 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-dpb2h"] Nov 26 22:52:14 crc kubenswrapper[5008]: I1126 22:52:14.929944 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-dpb2h" Nov 26 22:52:14 crc kubenswrapper[5008]: I1126 22:52:14.933165 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-67b7t" Nov 26 22:52:14 crc kubenswrapper[5008]: I1126 22:52:14.951652 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-dpb2h"] Nov 26 22:52:14 crc kubenswrapper[5008]: I1126 22:52:14.980716 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmf2b\" (UniqueName: \"kubernetes.io/projected/2d9ded7f-f632-43d6-bcc8-b6fbbd477435-kube-api-access-tmf2b\") pod \"infra-operator-index-dpb2h\" (UID: \"2d9ded7f-f632-43d6-bcc8-b6fbbd477435\") " pod="openstack-operators/infra-operator-index-dpb2h" Nov 26 22:52:15 crc kubenswrapper[5008]: I1126 22:52:15.082559 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmf2b\" (UniqueName: 
\"kubernetes.io/projected/2d9ded7f-f632-43d6-bcc8-b6fbbd477435-kube-api-access-tmf2b\") pod \"infra-operator-index-dpb2h\" (UID: \"2d9ded7f-f632-43d6-bcc8-b6fbbd477435\") " pod="openstack-operators/infra-operator-index-dpb2h" Nov 26 22:52:15 crc kubenswrapper[5008]: I1126 22:52:15.112715 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmf2b\" (UniqueName: \"kubernetes.io/projected/2d9ded7f-f632-43d6-bcc8-b6fbbd477435-kube-api-access-tmf2b\") pod \"infra-operator-index-dpb2h\" (UID: \"2d9ded7f-f632-43d6-bcc8-b6fbbd477435\") " pod="openstack-operators/infra-operator-index-dpb2h" Nov 26 22:52:15 crc kubenswrapper[5008]: I1126 22:52:15.288435 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-dpb2h" Nov 26 22:52:15 crc kubenswrapper[5008]: I1126 22:52:15.505695 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-dpb2h"] Nov 26 22:52:16 crc kubenswrapper[5008]: I1126 22:52:16.397159 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-dpb2h" event={"ID":"2d9ded7f-f632-43d6-bcc8-b6fbbd477435","Type":"ContainerStarted","Data":"250127d867d643af311924b7a780a58bfa0c8e42b6adb9a5bb3574f9de8fddc6"} Nov 26 22:52:17 crc kubenswrapper[5008]: I1126 22:52:17.405566 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-dpb2h" event={"ID":"2d9ded7f-f632-43d6-bcc8-b6fbbd477435","Type":"ContainerStarted","Data":"857c79b71e9518579f2f0d282c38d1929a9c0cf6f95a7cd62e455487bcfded1a"} Nov 26 22:52:17 crc kubenswrapper[5008]: I1126 22:52:17.431790 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-dpb2h" podStartSLOduration=1.921885506 podStartE2EDuration="3.431758527s" podCreationTimestamp="2025-11-26 22:52:14 +0000 UTC" firstStartedPulling="2025-11-26 22:52:15.527526243 
+0000 UTC m=+810.940220245" lastFinishedPulling="2025-11-26 22:52:17.037399264 +0000 UTC m=+812.450093266" observedRunningTime="2025-11-26 22:52:17.425098227 +0000 UTC m=+812.837792279" watchObservedRunningTime="2025-11-26 22:52:17.431758527 +0000 UTC m=+812.844452569" Nov 26 22:52:19 crc kubenswrapper[5008]: I1126 22:52:19.313631 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-dpb2h"] Nov 26 22:52:19 crc kubenswrapper[5008]: I1126 22:52:19.421004 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-dpb2h" podUID="2d9ded7f-f632-43d6-bcc8-b6fbbd477435" containerName="registry-server" containerID="cri-o://857c79b71e9518579f2f0d282c38d1929a9c0cf6f95a7cd62e455487bcfded1a" gracePeriod=2 Nov 26 22:52:19 crc kubenswrapper[5008]: I1126 22:52:19.868232 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-dpb2h" Nov 26 22:52:19 crc kubenswrapper[5008]: I1126 22:52:19.921367 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-mlmbf"] Nov 26 22:52:19 crc kubenswrapper[5008]: E1126 22:52:19.921664 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d9ded7f-f632-43d6-bcc8-b6fbbd477435" containerName="registry-server" Nov 26 22:52:19 crc kubenswrapper[5008]: I1126 22:52:19.921690 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d9ded7f-f632-43d6-bcc8-b6fbbd477435" containerName="registry-server" Nov 26 22:52:19 crc kubenswrapper[5008]: I1126 22:52:19.921878 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d9ded7f-f632-43d6-bcc8-b6fbbd477435" containerName="registry-server" Nov 26 22:52:19 crc kubenswrapper[5008]: I1126 22:52:19.922373 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-mlmbf" Nov 26 22:52:19 crc kubenswrapper[5008]: I1126 22:52:19.935405 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-mlmbf"] Nov 26 22:52:19 crc kubenswrapper[5008]: I1126 22:52:19.943627 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmf2b\" (UniqueName: \"kubernetes.io/projected/2d9ded7f-f632-43d6-bcc8-b6fbbd477435-kube-api-access-tmf2b\") pod \"2d9ded7f-f632-43d6-bcc8-b6fbbd477435\" (UID: \"2d9ded7f-f632-43d6-bcc8-b6fbbd477435\") " Nov 26 22:52:19 crc kubenswrapper[5008]: I1126 22:52:19.953202 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d9ded7f-f632-43d6-bcc8-b6fbbd477435-kube-api-access-tmf2b" (OuterVolumeSpecName: "kube-api-access-tmf2b") pod "2d9ded7f-f632-43d6-bcc8-b6fbbd477435" (UID: "2d9ded7f-f632-43d6-bcc8-b6fbbd477435"). InnerVolumeSpecName "kube-api-access-tmf2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:52:20 crc kubenswrapper[5008]: I1126 22:52:20.045875 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t26cg\" (UniqueName: \"kubernetes.io/projected/ce1e5e7a-1c65-4bec-b3e8-a8181067cbb8-kube-api-access-t26cg\") pod \"infra-operator-index-mlmbf\" (UID: \"ce1e5e7a-1c65-4bec-b3e8-a8181067cbb8\") " pod="openstack-operators/infra-operator-index-mlmbf" Nov 26 22:52:20 crc kubenswrapper[5008]: I1126 22:52:20.046101 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmf2b\" (UniqueName: \"kubernetes.io/projected/2d9ded7f-f632-43d6-bcc8-b6fbbd477435-kube-api-access-tmf2b\") on node \"crc\" DevicePath \"\"" Nov 26 22:52:20 crc kubenswrapper[5008]: I1126 22:52:20.147945 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t26cg\" (UniqueName: \"kubernetes.io/projected/ce1e5e7a-1c65-4bec-b3e8-a8181067cbb8-kube-api-access-t26cg\") pod \"infra-operator-index-mlmbf\" (UID: \"ce1e5e7a-1c65-4bec-b3e8-a8181067cbb8\") " pod="openstack-operators/infra-operator-index-mlmbf" Nov 26 22:52:20 crc kubenswrapper[5008]: I1126 22:52:20.170508 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t26cg\" (UniqueName: \"kubernetes.io/projected/ce1e5e7a-1c65-4bec-b3e8-a8181067cbb8-kube-api-access-t26cg\") pod \"infra-operator-index-mlmbf\" (UID: \"ce1e5e7a-1c65-4bec-b3e8-a8181067cbb8\") " pod="openstack-operators/infra-operator-index-mlmbf" Nov 26 22:52:20 crc kubenswrapper[5008]: I1126 22:52:20.254169 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-mlmbf" Nov 26 22:52:20 crc kubenswrapper[5008]: I1126 22:52:20.442374 5008 generic.go:334] "Generic (PLEG): container finished" podID="2d9ded7f-f632-43d6-bcc8-b6fbbd477435" containerID="857c79b71e9518579f2f0d282c38d1929a9c0cf6f95a7cd62e455487bcfded1a" exitCode=0 Nov 26 22:52:20 crc kubenswrapper[5008]: I1126 22:52:20.442596 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-dpb2h" event={"ID":"2d9ded7f-f632-43d6-bcc8-b6fbbd477435","Type":"ContainerDied","Data":"857c79b71e9518579f2f0d282c38d1929a9c0cf6f95a7cd62e455487bcfded1a"} Nov 26 22:52:20 crc kubenswrapper[5008]: I1126 22:52:20.442769 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-dpb2h" event={"ID":"2d9ded7f-f632-43d6-bcc8-b6fbbd477435","Type":"ContainerDied","Data":"250127d867d643af311924b7a780a58bfa0c8e42b6adb9a5bb3574f9de8fddc6"} Nov 26 22:52:20 crc kubenswrapper[5008]: I1126 22:52:20.442799 5008 scope.go:117] "RemoveContainer" containerID="857c79b71e9518579f2f0d282c38d1929a9c0cf6f95a7cd62e455487bcfded1a" Nov 26 22:52:20 crc kubenswrapper[5008]: I1126 22:52:20.442685 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-dpb2h" Nov 26 22:52:20 crc kubenswrapper[5008]: I1126 22:52:20.469494 5008 scope.go:117] "RemoveContainer" containerID="857c79b71e9518579f2f0d282c38d1929a9c0cf6f95a7cd62e455487bcfded1a" Nov 26 22:52:20 crc kubenswrapper[5008]: E1126 22:52:20.469923 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"857c79b71e9518579f2f0d282c38d1929a9c0cf6f95a7cd62e455487bcfded1a\": container with ID starting with 857c79b71e9518579f2f0d282c38d1929a9c0cf6f95a7cd62e455487bcfded1a not found: ID does not exist" containerID="857c79b71e9518579f2f0d282c38d1929a9c0cf6f95a7cd62e455487bcfded1a" Nov 26 22:52:20 crc kubenswrapper[5008]: I1126 22:52:20.469945 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"857c79b71e9518579f2f0d282c38d1929a9c0cf6f95a7cd62e455487bcfded1a"} err="failed to get container status \"857c79b71e9518579f2f0d282c38d1929a9c0cf6f95a7cd62e455487bcfded1a\": rpc error: code = NotFound desc = could not find container \"857c79b71e9518579f2f0d282c38d1929a9c0cf6f95a7cd62e455487bcfded1a\": container with ID starting with 857c79b71e9518579f2f0d282c38d1929a9c0cf6f95a7cd62e455487bcfded1a not found: ID does not exist" Nov 26 22:52:20 crc kubenswrapper[5008]: I1126 22:52:20.481017 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-dpb2h"] Nov 26 22:52:20 crc kubenswrapper[5008]: I1126 22:52:20.484158 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-dpb2h"] Nov 26 22:52:20 crc kubenswrapper[5008]: I1126 22:52:20.758902 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-mlmbf"] Nov 26 22:52:21 crc kubenswrapper[5008]: I1126 22:52:21.451532 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-index-mlmbf" event={"ID":"ce1e5e7a-1c65-4bec-b3e8-a8181067cbb8","Type":"ContainerStarted","Data":"a7b81227118427b3fbca55c3fb64818b2efb1a4b60a3aac40b1a176d1db21c4b"} Nov 26 22:52:21 crc kubenswrapper[5008]: I1126 22:52:21.529852 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d9ded7f-f632-43d6-bcc8-b6fbbd477435" path="/var/lib/kubelet/pods/2d9ded7f-f632-43d6-bcc8-b6fbbd477435/volumes" Nov 26 22:52:22 crc kubenswrapper[5008]: I1126 22:52:22.462446 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-mlmbf" event={"ID":"ce1e5e7a-1c65-4bec-b3e8-a8181067cbb8","Type":"ContainerStarted","Data":"d4ad9cb727d2cd1525520459ff06823052b2fee0dcc209f6db549f95e11f1c4d"} Nov 26 22:52:22 crc kubenswrapper[5008]: I1126 22:52:22.487444 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-mlmbf" podStartSLOduration=2.8592999199999998 podStartE2EDuration="3.48742273s" podCreationTimestamp="2025-11-26 22:52:19 +0000 UTC" firstStartedPulling="2025-11-26 22:52:20.780801035 +0000 UTC m=+816.193495067" lastFinishedPulling="2025-11-26 22:52:21.408923865 +0000 UTC m=+816.821617877" observedRunningTime="2025-11-26 22:52:22.484809518 +0000 UTC m=+817.897503560" watchObservedRunningTime="2025-11-26 22:52:22.48742273 +0000 UTC m=+817.900116772" Nov 26 22:52:28 crc kubenswrapper[5008]: I1126 22:52:28.146103 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cc2l6"] Nov 26 22:52:28 crc kubenswrapper[5008]: I1126 22:52:28.148372 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cc2l6" Nov 26 22:52:28 crc kubenswrapper[5008]: I1126 22:52:28.162392 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cc2l6"] Nov 26 22:52:28 crc kubenswrapper[5008]: I1126 22:52:28.178753 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21b54bf5-b620-4011-ae50-dbc43c938ca5-catalog-content\") pod \"redhat-marketplace-cc2l6\" (UID: \"21b54bf5-b620-4011-ae50-dbc43c938ca5\") " pod="openshift-marketplace/redhat-marketplace-cc2l6" Nov 26 22:52:28 crc kubenswrapper[5008]: I1126 22:52:28.179136 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd9pw\" (UniqueName: \"kubernetes.io/projected/21b54bf5-b620-4011-ae50-dbc43c938ca5-kube-api-access-bd9pw\") pod \"redhat-marketplace-cc2l6\" (UID: \"21b54bf5-b620-4011-ae50-dbc43c938ca5\") " pod="openshift-marketplace/redhat-marketplace-cc2l6" Nov 26 22:52:28 crc kubenswrapper[5008]: I1126 22:52:28.179251 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21b54bf5-b620-4011-ae50-dbc43c938ca5-utilities\") pod \"redhat-marketplace-cc2l6\" (UID: \"21b54bf5-b620-4011-ae50-dbc43c938ca5\") " pod="openshift-marketplace/redhat-marketplace-cc2l6" Nov 26 22:52:28 crc kubenswrapper[5008]: I1126 22:52:28.280630 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21b54bf5-b620-4011-ae50-dbc43c938ca5-catalog-content\") pod \"redhat-marketplace-cc2l6\" (UID: \"21b54bf5-b620-4011-ae50-dbc43c938ca5\") " pod="openshift-marketplace/redhat-marketplace-cc2l6" Nov 26 22:52:28 crc kubenswrapper[5008]: I1126 22:52:28.280695 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bd9pw\" (UniqueName: \"kubernetes.io/projected/21b54bf5-b620-4011-ae50-dbc43c938ca5-kube-api-access-bd9pw\") pod \"redhat-marketplace-cc2l6\" (UID: \"21b54bf5-b620-4011-ae50-dbc43c938ca5\") " pod="openshift-marketplace/redhat-marketplace-cc2l6" Nov 26 22:52:28 crc kubenswrapper[5008]: I1126 22:52:28.280727 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21b54bf5-b620-4011-ae50-dbc43c938ca5-utilities\") pod \"redhat-marketplace-cc2l6\" (UID: \"21b54bf5-b620-4011-ae50-dbc43c938ca5\") " pod="openshift-marketplace/redhat-marketplace-cc2l6" Nov 26 22:52:28 crc kubenswrapper[5008]: I1126 22:52:28.281255 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21b54bf5-b620-4011-ae50-dbc43c938ca5-catalog-content\") pod \"redhat-marketplace-cc2l6\" (UID: \"21b54bf5-b620-4011-ae50-dbc43c938ca5\") " pod="openshift-marketplace/redhat-marketplace-cc2l6" Nov 26 22:52:28 crc kubenswrapper[5008]: I1126 22:52:28.281264 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21b54bf5-b620-4011-ae50-dbc43c938ca5-utilities\") pod \"redhat-marketplace-cc2l6\" (UID: \"21b54bf5-b620-4011-ae50-dbc43c938ca5\") " pod="openshift-marketplace/redhat-marketplace-cc2l6" Nov 26 22:52:28 crc kubenswrapper[5008]: I1126 22:52:28.308563 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd9pw\" (UniqueName: \"kubernetes.io/projected/21b54bf5-b620-4011-ae50-dbc43c938ca5-kube-api-access-bd9pw\") pod \"redhat-marketplace-cc2l6\" (UID: \"21b54bf5-b620-4011-ae50-dbc43c938ca5\") " pod="openshift-marketplace/redhat-marketplace-cc2l6" Nov 26 22:52:28 crc kubenswrapper[5008]: I1126 22:52:28.490705 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cc2l6" Nov 26 22:52:28 crc kubenswrapper[5008]: I1126 22:52:28.719986 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cc2l6"] Nov 26 22:52:28 crc kubenswrapper[5008]: W1126 22:52:28.734652 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21b54bf5_b620_4011_ae50_dbc43c938ca5.slice/crio-6ea9820187eb4f9b266d5705e3c1a9988bc06f235a7d26f512a07d8ad0706ec7 WatchSource:0}: Error finding container 6ea9820187eb4f9b266d5705e3c1a9988bc06f235a7d26f512a07d8ad0706ec7: Status 404 returned error can't find the container with id 6ea9820187eb4f9b266d5705e3c1a9988bc06f235a7d26f512a07d8ad0706ec7 Nov 26 22:52:29 crc kubenswrapper[5008]: I1126 22:52:29.518299 5008 generic.go:334] "Generic (PLEG): container finished" podID="21b54bf5-b620-4011-ae50-dbc43c938ca5" containerID="e6e96781910ea6f63218dd4a98669c7a13309664e4434020489037d59bc96785" exitCode=0 Nov 26 22:52:29 crc kubenswrapper[5008]: I1126 22:52:29.533996 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cc2l6" event={"ID":"21b54bf5-b620-4011-ae50-dbc43c938ca5","Type":"ContainerDied","Data":"e6e96781910ea6f63218dd4a98669c7a13309664e4434020489037d59bc96785"} Nov 26 22:52:29 crc kubenswrapper[5008]: I1126 22:52:29.534196 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cc2l6" event={"ID":"21b54bf5-b620-4011-ae50-dbc43c938ca5","Type":"ContainerStarted","Data":"6ea9820187eb4f9b266d5705e3c1a9988bc06f235a7d26f512a07d8ad0706ec7"} Nov 26 22:52:30 crc kubenswrapper[5008]: I1126 22:52:30.255109 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-mlmbf" Nov 26 22:52:30 crc kubenswrapper[5008]: I1126 22:52:30.255387 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack-operators/infra-operator-index-mlmbf" Nov 26 22:52:30 crc kubenswrapper[5008]: I1126 22:52:30.302660 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-mlmbf" Nov 26 22:52:30 crc kubenswrapper[5008]: I1126 22:52:30.525069 5008 generic.go:334] "Generic (PLEG): container finished" podID="21b54bf5-b620-4011-ae50-dbc43c938ca5" containerID="897f5f413f21ce53df96d0d27a6e5e6aeae1a5da108e36d364e16591343ee249" exitCode=0 Nov 26 22:52:30 crc kubenswrapper[5008]: I1126 22:52:30.525288 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cc2l6" event={"ID":"21b54bf5-b620-4011-ae50-dbc43c938ca5","Type":"ContainerDied","Data":"897f5f413f21ce53df96d0d27a6e5e6aeae1a5da108e36d364e16591343ee249"} Nov 26 22:52:30 crc kubenswrapper[5008]: I1126 22:52:30.554316 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-mlmbf" Nov 26 22:52:31 crc kubenswrapper[5008]: I1126 22:52:31.534809 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cc2l6" event={"ID":"21b54bf5-b620-4011-ae50-dbc43c938ca5","Type":"ContainerStarted","Data":"5ba29993c62c091a76db4962d7ecb46628a562038ff1015f161f9b3500b14ead"} Nov 26 22:52:31 crc kubenswrapper[5008]: I1126 22:52:31.570559 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cc2l6" podStartSLOduration=2.13562768 podStartE2EDuration="3.570538444s" podCreationTimestamp="2025-11-26 22:52:28 +0000 UTC" firstStartedPulling="2025-11-26 22:52:29.52336625 +0000 UTC m=+824.936060292" lastFinishedPulling="2025-11-26 22:52:30.958277044 +0000 UTC m=+826.370971056" observedRunningTime="2025-11-26 22:52:31.56944683 +0000 UTC m=+826.982140822" watchObservedRunningTime="2025-11-26 22:52:31.570538444 +0000 UTC m=+826.983232456" Nov 26 22:52:33 crc 
kubenswrapper[5008]: I1126 22:52:33.768704 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l"] Nov 26 22:52:33 crc kubenswrapper[5008]: I1126 22:52:33.771375 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l" Nov 26 22:52:33 crc kubenswrapper[5008]: I1126 22:52:33.774198 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6kdt6" Nov 26 22:52:33 crc kubenswrapper[5008]: I1126 22:52:33.798497 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l"] Nov 26 22:52:33 crc kubenswrapper[5008]: I1126 22:52:33.859463 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3032b07d-49b2-49d7-bc65-3523602306e7-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l\" (UID: \"3032b07d-49b2-49d7-bc65-3523602306e7\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l" Nov 26 22:52:33 crc kubenswrapper[5008]: I1126 22:52:33.859570 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2t2x\" (UniqueName: \"kubernetes.io/projected/3032b07d-49b2-49d7-bc65-3523602306e7-kube-api-access-b2t2x\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l\" (UID: \"3032b07d-49b2-49d7-bc65-3523602306e7\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l" Nov 26 22:52:33 crc kubenswrapper[5008]: I1126 22:52:33.859606 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/3032b07d-49b2-49d7-bc65-3523602306e7-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l\" (UID: \"3032b07d-49b2-49d7-bc65-3523602306e7\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l" Nov 26 22:52:33 crc kubenswrapper[5008]: I1126 22:52:33.961181 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2t2x\" (UniqueName: \"kubernetes.io/projected/3032b07d-49b2-49d7-bc65-3523602306e7-kube-api-access-b2t2x\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l\" (UID: \"3032b07d-49b2-49d7-bc65-3523602306e7\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l" Nov 26 22:52:33 crc kubenswrapper[5008]: I1126 22:52:33.961282 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3032b07d-49b2-49d7-bc65-3523602306e7-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l\" (UID: \"3032b07d-49b2-49d7-bc65-3523602306e7\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l" Nov 26 22:52:33 crc kubenswrapper[5008]: I1126 22:52:33.961400 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3032b07d-49b2-49d7-bc65-3523602306e7-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l\" (UID: \"3032b07d-49b2-49d7-bc65-3523602306e7\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l" Nov 26 22:52:33 crc kubenswrapper[5008]: I1126 22:52:33.962656 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3032b07d-49b2-49d7-bc65-3523602306e7-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l\" (UID: 
\"3032b07d-49b2-49d7-bc65-3523602306e7\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l" Nov 26 22:52:33 crc kubenswrapper[5008]: I1126 22:52:33.962786 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3032b07d-49b2-49d7-bc65-3523602306e7-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l\" (UID: \"3032b07d-49b2-49d7-bc65-3523602306e7\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l" Nov 26 22:52:33 crc kubenswrapper[5008]: I1126 22:52:33.993217 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2t2x\" (UniqueName: \"kubernetes.io/projected/3032b07d-49b2-49d7-bc65-3523602306e7-kube-api-access-b2t2x\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l\" (UID: \"3032b07d-49b2-49d7-bc65-3523602306e7\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l" Nov 26 22:52:34 crc kubenswrapper[5008]: I1126 22:52:34.100091 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l" Nov 26 22:52:34 crc kubenswrapper[5008]: I1126 22:52:34.502581 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l"] Nov 26 22:52:34 crc kubenswrapper[5008]: W1126 22:52:34.512878 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3032b07d_49b2_49d7_bc65_3523602306e7.slice/crio-5cdd16e471bd7ab75e619e1923b8dcc474a61915edb13c495a84f02857d51472 WatchSource:0}: Error finding container 5cdd16e471bd7ab75e619e1923b8dcc474a61915edb13c495a84f02857d51472: Status 404 returned error can't find the container with id 5cdd16e471bd7ab75e619e1923b8dcc474a61915edb13c495a84f02857d51472 Nov 26 22:52:34 crc kubenswrapper[5008]: I1126 22:52:34.556983 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l" event={"ID":"3032b07d-49b2-49d7-bc65-3523602306e7","Type":"ContainerStarted","Data":"5cdd16e471bd7ab75e619e1923b8dcc474a61915edb13c495a84f02857d51472"} Nov 26 22:52:35 crc kubenswrapper[5008]: I1126 22:52:35.567285 5008 generic.go:334] "Generic (PLEG): container finished" podID="3032b07d-49b2-49d7-bc65-3523602306e7" containerID="350f5ab70420158ffe16cefb4d31522500ef1b534116e7a073e0478d0fa7c5da" exitCode=0 Nov 26 22:52:35 crc kubenswrapper[5008]: I1126 22:52:35.567352 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l" event={"ID":"3032b07d-49b2-49d7-bc65-3523602306e7","Type":"ContainerDied","Data":"350f5ab70420158ffe16cefb4d31522500ef1b534116e7a073e0478d0fa7c5da"} Nov 26 22:52:37 crc kubenswrapper[5008]: I1126 22:52:37.593174 5008 generic.go:334] "Generic (PLEG): container finished" 
podID="3032b07d-49b2-49d7-bc65-3523602306e7" containerID="ecb8cc0d201fc67d824cf36f54a68fd1dc5cbd3ffaa21dc4e20562982fbceeb5" exitCode=0 Nov 26 22:52:37 crc kubenswrapper[5008]: I1126 22:52:37.593643 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l" event={"ID":"3032b07d-49b2-49d7-bc65-3523602306e7","Type":"ContainerDied","Data":"ecb8cc0d201fc67d824cf36f54a68fd1dc5cbd3ffaa21dc4e20562982fbceeb5"} Nov 26 22:52:38 crc kubenswrapper[5008]: I1126 22:52:38.493774 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cc2l6" Nov 26 22:52:38 crc kubenswrapper[5008]: I1126 22:52:38.493842 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cc2l6" Nov 26 22:52:38 crc kubenswrapper[5008]: I1126 22:52:38.608584 5008 generic.go:334] "Generic (PLEG): container finished" podID="3032b07d-49b2-49d7-bc65-3523602306e7" containerID="aaa34dca8f8391a70ec6c2acea9d4cf4bcb7c60e120eb44aa94dc8945a5e85d9" exitCode=0 Nov 26 22:52:38 crc kubenswrapper[5008]: I1126 22:52:38.608650 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l" event={"ID":"3032b07d-49b2-49d7-bc65-3523602306e7","Type":"ContainerDied","Data":"aaa34dca8f8391a70ec6c2acea9d4cf4bcb7c60e120eb44aa94dc8945a5e85d9"} Nov 26 22:52:38 crc kubenswrapper[5008]: I1126 22:52:38.663639 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cc2l6" Nov 26 22:52:38 crc kubenswrapper[5008]: I1126 22:52:38.727321 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cc2l6" Nov 26 22:52:39 crc kubenswrapper[5008]: I1126 22:52:39.979448 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l" Nov 26 22:52:40 crc kubenswrapper[5008]: I1126 22:52:40.050854 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2t2x\" (UniqueName: \"kubernetes.io/projected/3032b07d-49b2-49d7-bc65-3523602306e7-kube-api-access-b2t2x\") pod \"3032b07d-49b2-49d7-bc65-3523602306e7\" (UID: \"3032b07d-49b2-49d7-bc65-3523602306e7\") " Nov 26 22:52:40 crc kubenswrapper[5008]: I1126 22:52:40.051281 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3032b07d-49b2-49d7-bc65-3523602306e7-bundle\") pod \"3032b07d-49b2-49d7-bc65-3523602306e7\" (UID: \"3032b07d-49b2-49d7-bc65-3523602306e7\") " Nov 26 22:52:40 crc kubenswrapper[5008]: I1126 22:52:40.051523 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3032b07d-49b2-49d7-bc65-3523602306e7-util\") pod \"3032b07d-49b2-49d7-bc65-3523602306e7\" (UID: \"3032b07d-49b2-49d7-bc65-3523602306e7\") " Nov 26 22:52:40 crc kubenswrapper[5008]: I1126 22:52:40.053431 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3032b07d-49b2-49d7-bc65-3523602306e7-bundle" (OuterVolumeSpecName: "bundle") pod "3032b07d-49b2-49d7-bc65-3523602306e7" (UID: "3032b07d-49b2-49d7-bc65-3523602306e7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:52:40 crc kubenswrapper[5008]: I1126 22:52:40.062178 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3032b07d-49b2-49d7-bc65-3523602306e7-kube-api-access-b2t2x" (OuterVolumeSpecName: "kube-api-access-b2t2x") pod "3032b07d-49b2-49d7-bc65-3523602306e7" (UID: "3032b07d-49b2-49d7-bc65-3523602306e7"). InnerVolumeSpecName "kube-api-access-b2t2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:52:40 crc kubenswrapper[5008]: I1126 22:52:40.119156 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cc2l6"] Nov 26 22:52:40 crc kubenswrapper[5008]: I1126 22:52:40.154399 5008 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3032b07d-49b2-49d7-bc65-3523602306e7-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 22:52:40 crc kubenswrapper[5008]: I1126 22:52:40.154451 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2t2x\" (UniqueName: \"kubernetes.io/projected/3032b07d-49b2-49d7-bc65-3523602306e7-kube-api-access-b2t2x\") on node \"crc\" DevicePath \"\"" Nov 26 22:52:40 crc kubenswrapper[5008]: I1126 22:52:40.228080 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3032b07d-49b2-49d7-bc65-3523602306e7-util" (OuterVolumeSpecName: "util") pod "3032b07d-49b2-49d7-bc65-3523602306e7" (UID: "3032b07d-49b2-49d7-bc65-3523602306e7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:52:40 crc kubenswrapper[5008]: I1126 22:52:40.256028 5008 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3032b07d-49b2-49d7-bc65-3523602306e7-util\") on node \"crc\" DevicePath \"\"" Nov 26 22:52:40 crc kubenswrapper[5008]: I1126 22:52:40.628480 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l" Nov 26 22:52:40 crc kubenswrapper[5008]: I1126 22:52:40.628474 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d89s5l" event={"ID":"3032b07d-49b2-49d7-bc65-3523602306e7","Type":"ContainerDied","Data":"5cdd16e471bd7ab75e619e1923b8dcc474a61915edb13c495a84f02857d51472"} Nov 26 22:52:40 crc kubenswrapper[5008]: I1126 22:52:40.630120 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cdd16e471bd7ab75e619e1923b8dcc474a61915edb13c495a84f02857d51472" Nov 26 22:52:40 crc kubenswrapper[5008]: I1126 22:52:40.628635 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cc2l6" podUID="21b54bf5-b620-4011-ae50-dbc43c938ca5" containerName="registry-server" containerID="cri-o://5ba29993c62c091a76db4962d7ecb46628a562038ff1015f161f9b3500b14ead" gracePeriod=2 Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.055421 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cc2l6" Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.171263 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21b54bf5-b620-4011-ae50-dbc43c938ca5-utilities\") pod \"21b54bf5-b620-4011-ae50-dbc43c938ca5\" (UID: \"21b54bf5-b620-4011-ae50-dbc43c938ca5\") " Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.171622 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd9pw\" (UniqueName: \"kubernetes.io/projected/21b54bf5-b620-4011-ae50-dbc43c938ca5-kube-api-access-bd9pw\") pod \"21b54bf5-b620-4011-ae50-dbc43c938ca5\" (UID: \"21b54bf5-b620-4011-ae50-dbc43c938ca5\") " Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.171901 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21b54bf5-b620-4011-ae50-dbc43c938ca5-catalog-content\") pod \"21b54bf5-b620-4011-ae50-dbc43c938ca5\" (UID: \"21b54bf5-b620-4011-ae50-dbc43c938ca5\") " Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.172210 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21b54bf5-b620-4011-ae50-dbc43c938ca5-utilities" (OuterVolumeSpecName: "utilities") pod "21b54bf5-b620-4011-ae50-dbc43c938ca5" (UID: "21b54bf5-b620-4011-ae50-dbc43c938ca5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.172405 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21b54bf5-b620-4011-ae50-dbc43c938ca5-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.176338 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b54bf5-b620-4011-ae50-dbc43c938ca5-kube-api-access-bd9pw" (OuterVolumeSpecName: "kube-api-access-bd9pw") pod "21b54bf5-b620-4011-ae50-dbc43c938ca5" (UID: "21b54bf5-b620-4011-ae50-dbc43c938ca5"). InnerVolumeSpecName "kube-api-access-bd9pw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.209209 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21b54bf5-b620-4011-ae50-dbc43c938ca5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21b54bf5-b620-4011-ae50-dbc43c938ca5" (UID: "21b54bf5-b620-4011-ae50-dbc43c938ca5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.274243 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd9pw\" (UniqueName: \"kubernetes.io/projected/21b54bf5-b620-4011-ae50-dbc43c938ca5-kube-api-access-bd9pw\") on node \"crc\" DevicePath \"\"" Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.274292 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21b54bf5-b620-4011-ae50-dbc43c938ca5-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.637300 5008 generic.go:334] "Generic (PLEG): container finished" podID="21b54bf5-b620-4011-ae50-dbc43c938ca5" containerID="5ba29993c62c091a76db4962d7ecb46628a562038ff1015f161f9b3500b14ead" exitCode=0 Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.637351 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cc2l6" event={"ID":"21b54bf5-b620-4011-ae50-dbc43c938ca5","Type":"ContainerDied","Data":"5ba29993c62c091a76db4962d7ecb46628a562038ff1015f161f9b3500b14ead"} Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.637391 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cc2l6" event={"ID":"21b54bf5-b620-4011-ae50-dbc43c938ca5","Type":"ContainerDied","Data":"6ea9820187eb4f9b266d5705e3c1a9988bc06f235a7d26f512a07d8ad0706ec7"} Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.637423 5008 scope.go:117] "RemoveContainer" containerID="5ba29993c62c091a76db4962d7ecb46628a562038ff1015f161f9b3500b14ead" Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.637610 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cc2l6" Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.656956 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cc2l6"] Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.657947 5008 scope.go:117] "RemoveContainer" containerID="897f5f413f21ce53df96d0d27a6e5e6aeae1a5da108e36d364e16591343ee249" Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.659384 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cc2l6"] Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.671229 5008 scope.go:117] "RemoveContainer" containerID="e6e96781910ea6f63218dd4a98669c7a13309664e4434020489037d59bc96785" Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.686667 5008 scope.go:117] "RemoveContainer" containerID="5ba29993c62c091a76db4962d7ecb46628a562038ff1015f161f9b3500b14ead" Nov 26 22:52:41 crc kubenswrapper[5008]: E1126 22:52:41.687264 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ba29993c62c091a76db4962d7ecb46628a562038ff1015f161f9b3500b14ead\": container with ID starting with 5ba29993c62c091a76db4962d7ecb46628a562038ff1015f161f9b3500b14ead not found: ID does not exist" containerID="5ba29993c62c091a76db4962d7ecb46628a562038ff1015f161f9b3500b14ead" Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.687311 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ba29993c62c091a76db4962d7ecb46628a562038ff1015f161f9b3500b14ead"} err="failed to get container status \"5ba29993c62c091a76db4962d7ecb46628a562038ff1015f161f9b3500b14ead\": rpc error: code = NotFound desc = could not find container \"5ba29993c62c091a76db4962d7ecb46628a562038ff1015f161f9b3500b14ead\": container with ID starting with 5ba29993c62c091a76db4962d7ecb46628a562038ff1015f161f9b3500b14ead not found: 
ID does not exist" Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.687340 5008 scope.go:117] "RemoveContainer" containerID="897f5f413f21ce53df96d0d27a6e5e6aeae1a5da108e36d364e16591343ee249" Nov 26 22:52:41 crc kubenswrapper[5008]: E1126 22:52:41.687668 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"897f5f413f21ce53df96d0d27a6e5e6aeae1a5da108e36d364e16591343ee249\": container with ID starting with 897f5f413f21ce53df96d0d27a6e5e6aeae1a5da108e36d364e16591343ee249 not found: ID does not exist" containerID="897f5f413f21ce53df96d0d27a6e5e6aeae1a5da108e36d364e16591343ee249" Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.687751 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"897f5f413f21ce53df96d0d27a6e5e6aeae1a5da108e36d364e16591343ee249"} err="failed to get container status \"897f5f413f21ce53df96d0d27a6e5e6aeae1a5da108e36d364e16591343ee249\": rpc error: code = NotFound desc = could not find container \"897f5f413f21ce53df96d0d27a6e5e6aeae1a5da108e36d364e16591343ee249\": container with ID starting with 897f5f413f21ce53df96d0d27a6e5e6aeae1a5da108e36d364e16591343ee249 not found: ID does not exist" Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.687812 5008 scope.go:117] "RemoveContainer" containerID="e6e96781910ea6f63218dd4a98669c7a13309664e4434020489037d59bc96785" Nov 26 22:52:41 crc kubenswrapper[5008]: E1126 22:52:41.688109 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6e96781910ea6f63218dd4a98669c7a13309664e4434020489037d59bc96785\": container with ID starting with e6e96781910ea6f63218dd4a98669c7a13309664e4434020489037d59bc96785 not found: ID does not exist" containerID="e6e96781910ea6f63218dd4a98669c7a13309664e4434020489037d59bc96785" Nov 26 22:52:41 crc kubenswrapper[5008]: I1126 22:52:41.688137 5008 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6e96781910ea6f63218dd4a98669c7a13309664e4434020489037d59bc96785"} err="failed to get container status \"e6e96781910ea6f63218dd4a98669c7a13309664e4434020489037d59bc96785\": rpc error: code = NotFound desc = could not find container \"e6e96781910ea6f63218dd4a98669c7a13309664e4434020489037d59bc96785\": container with ID starting with e6e96781910ea6f63218dd4a98669c7a13309664e4434020489037d59bc96785 not found: ID does not exist" Nov 26 22:52:43 crc kubenswrapper[5008]: I1126 22:52:43.534421 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21b54bf5-b620-4011-ae50-dbc43c938ca5" path="/var/lib/kubelet/pods/21b54bf5-b620-4011-ae50-dbc43c938ca5/volumes" Nov 26 22:52:46 crc kubenswrapper[5008]: I1126 22:52:46.962900 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt"] Nov 26 22:52:46 crc kubenswrapper[5008]: E1126 22:52:46.963657 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3032b07d-49b2-49d7-bc65-3523602306e7" containerName="extract" Nov 26 22:52:46 crc kubenswrapper[5008]: I1126 22:52:46.963671 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="3032b07d-49b2-49d7-bc65-3523602306e7" containerName="extract" Nov 26 22:52:46 crc kubenswrapper[5008]: E1126 22:52:46.963683 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3032b07d-49b2-49d7-bc65-3523602306e7" containerName="util" Nov 26 22:52:46 crc kubenswrapper[5008]: I1126 22:52:46.963689 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="3032b07d-49b2-49d7-bc65-3523602306e7" containerName="util" Nov 26 22:52:46 crc kubenswrapper[5008]: E1126 22:52:46.963703 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b54bf5-b620-4011-ae50-dbc43c938ca5" containerName="extract-content" Nov 26 22:52:46 crc kubenswrapper[5008]: I1126 22:52:46.963713 5008 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="21b54bf5-b620-4011-ae50-dbc43c938ca5" containerName="extract-content" Nov 26 22:52:46 crc kubenswrapper[5008]: E1126 22:52:46.963727 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3032b07d-49b2-49d7-bc65-3523602306e7" containerName="pull" Nov 26 22:52:46 crc kubenswrapper[5008]: I1126 22:52:46.963734 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="3032b07d-49b2-49d7-bc65-3523602306e7" containerName="pull" Nov 26 22:52:46 crc kubenswrapper[5008]: E1126 22:52:46.963748 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b54bf5-b620-4011-ae50-dbc43c938ca5" containerName="extract-utilities" Nov 26 22:52:46 crc kubenswrapper[5008]: I1126 22:52:46.963754 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b54bf5-b620-4011-ae50-dbc43c938ca5" containerName="extract-utilities" Nov 26 22:52:46 crc kubenswrapper[5008]: E1126 22:52:46.963761 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b54bf5-b620-4011-ae50-dbc43c938ca5" containerName="registry-server" Nov 26 22:52:46 crc kubenswrapper[5008]: I1126 22:52:46.963768 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b54bf5-b620-4011-ae50-dbc43c938ca5" containerName="registry-server" Nov 26 22:52:46 crc kubenswrapper[5008]: I1126 22:52:46.963871 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="3032b07d-49b2-49d7-bc65-3523602306e7" containerName="extract" Nov 26 22:52:46 crc kubenswrapper[5008]: I1126 22:52:46.963885 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="21b54bf5-b620-4011-ae50-dbc43c938ca5" containerName="registry-server" Nov 26 22:52:46 crc kubenswrapper[5008]: I1126 22:52:46.964615 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" Nov 26 22:52:46 crc kubenswrapper[5008]: I1126 22:52:46.966702 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-d5mdr" Nov 26 22:52:46 crc kubenswrapper[5008]: I1126 22:52:46.969345 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Nov 26 22:52:46 crc kubenswrapper[5008]: I1126 22:52:46.979221 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt"] Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.048804 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q2fd\" (UniqueName: \"kubernetes.io/projected/9309fbab-00dc-4e76-a384-b9297f098fe9-kube-api-access-8q2fd\") pod \"infra-operator-controller-manager-5476c5fbf7-zg2vt\" (UID: \"9309fbab-00dc-4e76-a384-b9297f098fe9\") " pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.048859 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9309fbab-00dc-4e76-a384-b9297f098fe9-webhook-cert\") pod \"infra-operator-controller-manager-5476c5fbf7-zg2vt\" (UID: \"9309fbab-00dc-4e76-a384-b9297f098fe9\") " pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.048892 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9309fbab-00dc-4e76-a384-b9297f098fe9-apiservice-cert\") pod \"infra-operator-controller-manager-5476c5fbf7-zg2vt\" (UID: 
\"9309fbab-00dc-4e76-a384-b9297f098fe9\") " pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.150646 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9309fbab-00dc-4e76-a384-b9297f098fe9-webhook-cert\") pod \"infra-operator-controller-manager-5476c5fbf7-zg2vt\" (UID: \"9309fbab-00dc-4e76-a384-b9297f098fe9\") " pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.150710 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9309fbab-00dc-4e76-a384-b9297f098fe9-apiservice-cert\") pod \"infra-operator-controller-manager-5476c5fbf7-zg2vt\" (UID: \"9309fbab-00dc-4e76-a384-b9297f098fe9\") " pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.150799 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q2fd\" (UniqueName: \"kubernetes.io/projected/9309fbab-00dc-4e76-a384-b9297f098fe9-kube-api-access-8q2fd\") pod \"infra-operator-controller-manager-5476c5fbf7-zg2vt\" (UID: \"9309fbab-00dc-4e76-a384-b9297f098fe9\") " pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.157175 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9309fbab-00dc-4e76-a384-b9297f098fe9-apiservice-cert\") pod \"infra-operator-controller-manager-5476c5fbf7-zg2vt\" (UID: \"9309fbab-00dc-4e76-a384-b9297f098fe9\") " pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.157303 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9309fbab-00dc-4e76-a384-b9297f098fe9-webhook-cert\") pod \"infra-operator-controller-manager-5476c5fbf7-zg2vt\" (UID: \"9309fbab-00dc-4e76-a384-b9297f098fe9\") " pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.176011 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q2fd\" (UniqueName: \"kubernetes.io/projected/9309fbab-00dc-4e76-a384-b9297f098fe9-kube-api-access-8q2fd\") pod \"infra-operator-controller-manager-5476c5fbf7-zg2vt\" (UID: \"9309fbab-00dc-4e76-a384-b9297f098fe9\") " pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.278805 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.544430 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.545715 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.548115 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openshift-service-ca.crt" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.548587 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config-data" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.549565 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"galera-openstack-dockercfg-ct9sx" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.553949 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"kube-root-ca.crt" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.558764 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.559914 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.561699 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.573671 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.591121 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.596476 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.597378 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.610116 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.671267 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/15645128-7001-44ca-b06c-25183ae8f5be-kolla-config\") pod \"openstack-galera-0\" (UID: \"15645128-7001-44ca-b06c-25183ae8f5be\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.671302 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/15645128-7001-44ca-b06c-25183ae8f5be-config-data-default\") pod \"openstack-galera-0\" (UID: \"15645128-7001-44ca-b06c-25183ae8f5be\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.671336 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43262a3b-7a09-4b65-9df6-2bfe519feff5-operator-scripts\") pod \"openstack-galera-2\" (UID: \"43262a3b-7a09-4b65-9df6-2bfe519feff5\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.671356 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/15645128-7001-44ca-b06c-25183ae8f5be-config-data-generated\") pod \"openstack-galera-0\" (UID: \"15645128-7001-44ca-b06c-25183ae8f5be\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.671373 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0dd65e2a-85a1-40be-826a-d521a2e94607-kolla-config\") pod \"openstack-galera-1\" (UID: \"0dd65e2a-85a1-40be-826a-d521a2e94607\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.671397 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj52x\" (UniqueName: \"kubernetes.io/projected/15645128-7001-44ca-b06c-25183ae8f5be-kube-api-access-gj52x\") pod \"openstack-galera-0\" (UID: \"15645128-7001-44ca-b06c-25183ae8f5be\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.671416 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-2\" (UID: \"43262a3b-7a09-4b65-9df6-2bfe519feff5\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.671442 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/43262a3b-7a09-4b65-9df6-2bfe519feff5-config-data-default\") pod \"openstack-galera-2\" (UID: \"43262a3b-7a09-4b65-9df6-2bfe519feff5\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.671468 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"15645128-7001-44ca-b06c-25183ae8f5be\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.671510 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/15645128-7001-44ca-b06c-25183ae8f5be-operator-scripts\") pod \"openstack-galera-0\" (UID: \"15645128-7001-44ca-b06c-25183ae8f5be\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.671526 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dd65e2a-85a1-40be-826a-d521a2e94607-operator-scripts\") pod \"openstack-galera-1\" (UID: \"0dd65e2a-85a1-40be-826a-d521a2e94607\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.671548 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/43262a3b-7a09-4b65-9df6-2bfe519feff5-kolla-config\") pod \"openstack-galera-2\" (UID: \"43262a3b-7a09-4b65-9df6-2bfe519feff5\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.671565 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-1\" (UID: \"0dd65e2a-85a1-40be-826a-d521a2e94607\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.671580 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0dd65e2a-85a1-40be-826a-d521a2e94607-config-data-generated\") pod \"openstack-galera-1\" (UID: \"0dd65e2a-85a1-40be-826a-d521a2e94607\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.671595 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lgqr\" (UniqueName: 
\"kubernetes.io/projected/0dd65e2a-85a1-40be-826a-d521a2e94607-kube-api-access-5lgqr\") pod \"openstack-galera-1\" (UID: \"0dd65e2a-85a1-40be-826a-d521a2e94607\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.671619 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgk52\" (UniqueName: \"kubernetes.io/projected/43262a3b-7a09-4b65-9df6-2bfe519feff5-kube-api-access-jgk52\") pod \"openstack-galera-2\" (UID: \"43262a3b-7a09-4b65-9df6-2bfe519feff5\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.671643 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/43262a3b-7a09-4b65-9df6-2bfe519feff5-config-data-generated\") pod \"openstack-galera-2\" (UID: \"43262a3b-7a09-4b65-9df6-2bfe519feff5\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.671671 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0dd65e2a-85a1-40be-826a-d521a2e94607-config-data-default\") pod \"openstack-galera-1\" (UID: \"0dd65e2a-85a1-40be-826a-d521a2e94607\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.719509 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt"] Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.773694 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-2\" (UID: \"43262a3b-7a09-4b65-9df6-2bfe519feff5\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 26 
22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.773933 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj52x\" (UniqueName: \"kubernetes.io/projected/15645128-7001-44ca-b06c-25183ae8f5be-kube-api-access-gj52x\") pod \"openstack-galera-0\" (UID: \"15645128-7001-44ca-b06c-25183ae8f5be\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.774034 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/43262a3b-7a09-4b65-9df6-2bfe519feff5-config-data-default\") pod \"openstack-galera-2\" (UID: \"43262a3b-7a09-4b65-9df6-2bfe519feff5\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.774104 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"15645128-7001-44ca-b06c-25183ae8f5be\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.774172 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15645128-7001-44ca-b06c-25183ae8f5be-operator-scripts\") pod \"openstack-galera-0\" (UID: \"15645128-7001-44ca-b06c-25183ae8f5be\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.774263 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dd65e2a-85a1-40be-826a-d521a2e94607-operator-scripts\") pod \"openstack-galera-1\" (UID: \"0dd65e2a-85a1-40be-826a-d521a2e94607\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.774330 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/43262a3b-7a09-4b65-9df6-2bfe519feff5-kolla-config\") pod \"openstack-galera-2\" (UID: \"43262a3b-7a09-4b65-9df6-2bfe519feff5\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.774390 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-1\" (UID: \"0dd65e2a-85a1-40be-826a-d521a2e94607\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.774449 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0dd65e2a-85a1-40be-826a-d521a2e94607-config-data-generated\") pod \"openstack-galera-1\" (UID: \"0dd65e2a-85a1-40be-826a-d521a2e94607\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.774508 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lgqr\" (UniqueName: \"kubernetes.io/projected/0dd65e2a-85a1-40be-826a-d521a2e94607-kube-api-access-5lgqr\") pod \"openstack-galera-1\" (UID: \"0dd65e2a-85a1-40be-826a-d521a2e94607\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.774584 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgk52\" (UniqueName: \"kubernetes.io/projected/43262a3b-7a09-4b65-9df6-2bfe519feff5-kube-api-access-jgk52\") pod \"openstack-galera-2\" (UID: \"43262a3b-7a09-4b65-9df6-2bfe519feff5\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.774651 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/43262a3b-7a09-4b65-9df6-2bfe519feff5-config-data-generated\") pod \"openstack-galera-2\" (UID: \"43262a3b-7a09-4b65-9df6-2bfe519feff5\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.774734 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0dd65e2a-85a1-40be-826a-d521a2e94607-config-data-default\") pod \"openstack-galera-1\" (UID: \"0dd65e2a-85a1-40be-826a-d521a2e94607\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.774800 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/15645128-7001-44ca-b06c-25183ae8f5be-config-data-default\") pod \"openstack-galera-0\" (UID: \"15645128-7001-44ca-b06c-25183ae8f5be\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.774872 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/15645128-7001-44ca-b06c-25183ae8f5be-kolla-config\") pod \"openstack-galera-0\" (UID: \"15645128-7001-44ca-b06c-25183ae8f5be\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.774945 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43262a3b-7a09-4b65-9df6-2bfe519feff5-operator-scripts\") pod \"openstack-galera-2\" (UID: \"43262a3b-7a09-4b65-9df6-2bfe519feff5\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.775037 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0dd65e2a-85a1-40be-826a-d521a2e94607-kolla-config\") pod 
\"openstack-galera-1\" (UID: \"0dd65e2a-85a1-40be-826a-d521a2e94607\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.775102 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/15645128-7001-44ca-b06c-25183ae8f5be-config-data-generated\") pod \"openstack-galera-0\" (UID: \"15645128-7001-44ca-b06c-25183ae8f5be\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.775360 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-1\" (UID: \"0dd65e2a-85a1-40be-826a-d521a2e94607\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.775414 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"15645128-7001-44ca-b06c-25183ae8f5be\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.775582 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/43262a3b-7a09-4b65-9df6-2bfe519feff5-config-data-generated\") pod \"openstack-galera-2\" (UID: \"43262a3b-7a09-4b65-9df6-2bfe519feff5\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.775758 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/15645128-7001-44ca-b06c-25183ae8f5be-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"15645128-7001-44ca-b06c-25183ae8f5be\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.775810 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0dd65e2a-85a1-40be-826a-d521a2e94607-config-data-generated\") pod \"openstack-galera-1\" (UID: \"0dd65e2a-85a1-40be-826a-d521a2e94607\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.776122 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-2\" (UID: \"43262a3b-7a09-4b65-9df6-2bfe519feff5\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.776149 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/15645128-7001-44ca-b06c-25183ae8f5be-kolla-config\") pod \"openstack-galera-0\" (UID: \"15645128-7001-44ca-b06c-25183ae8f5be\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.776549 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0dd65e2a-85a1-40be-826a-d521a2e94607-kolla-config\") pod \"openstack-galera-1\" (UID: \"0dd65e2a-85a1-40be-826a-d521a2e94607\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.776723 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dd65e2a-85a1-40be-826a-d521a2e94607-operator-scripts\") pod \"openstack-galera-1\" (UID: \"0dd65e2a-85a1-40be-826a-d521a2e94607\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:47 crc 
kubenswrapper[5008]: I1126 22:52:47.777045 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0dd65e2a-85a1-40be-826a-d521a2e94607-config-data-default\") pod \"openstack-galera-1\" (UID: \"0dd65e2a-85a1-40be-826a-d521a2e94607\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.777260 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/43262a3b-7a09-4b65-9df6-2bfe519feff5-kolla-config\") pod \"openstack-galera-2\" (UID: \"43262a3b-7a09-4b65-9df6-2bfe519feff5\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.777309 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/15645128-7001-44ca-b06c-25183ae8f5be-config-data-default\") pod \"openstack-galera-0\" (UID: \"15645128-7001-44ca-b06c-25183ae8f5be\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.777626 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43262a3b-7a09-4b65-9df6-2bfe519feff5-operator-scripts\") pod \"openstack-galera-2\" (UID: \"43262a3b-7a09-4b65-9df6-2bfe519feff5\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.777914 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/43262a3b-7a09-4b65-9df6-2bfe519feff5-config-data-default\") pod \"openstack-galera-2\" (UID: \"43262a3b-7a09-4b65-9df6-2bfe519feff5\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.778022 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15645128-7001-44ca-b06c-25183ae8f5be-operator-scripts\") pod \"openstack-galera-0\" (UID: \"15645128-7001-44ca-b06c-25183ae8f5be\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.803077 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-1\" (UID: \"0dd65e2a-85a1-40be-826a-d521a2e94607\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.809107 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-2\" (UID: \"43262a3b-7a09-4b65-9df6-2bfe519feff5\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.821553 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj52x\" (UniqueName: \"kubernetes.io/projected/15645128-7001-44ca-b06c-25183ae8f5be-kube-api-access-gj52x\") pod \"openstack-galera-0\" (UID: \"15645128-7001-44ca-b06c-25183ae8f5be\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.823473 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lgqr\" (UniqueName: \"kubernetes.io/projected/0dd65e2a-85a1-40be-826a-d521a2e94607-kube-api-access-5lgqr\") pod \"openstack-galera-1\" (UID: \"0dd65e2a-85a1-40be-826a-d521a2e94607\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.829471 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgk52\" (UniqueName: \"kubernetes.io/projected/43262a3b-7a09-4b65-9df6-2bfe519feff5-kube-api-access-jgk52\") pod \"openstack-galera-2\" (UID: 
\"43262a3b-7a09-4b65-9df6-2bfe519feff5\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.835023 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"15645128-7001-44ca-b06c-25183ae8f5be\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.872904 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.890561 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:52:47 crc kubenswrapper[5008]: I1126 22:52:47.912980 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:52:48 crc kubenswrapper[5008]: I1126 22:52:48.277911 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Nov 26 22:52:48 crc kubenswrapper[5008]: I1126 22:52:48.346156 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Nov 26 22:52:48 crc kubenswrapper[5008]: I1126 22:52:48.349521 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Nov 26 22:52:48 crc kubenswrapper[5008]: W1126 22:52:48.357846 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dd65e2a_85a1_40be_826a_d521a2e94607.slice/crio-a85effae4a6d3d9422c3448151c03c94dae7127067b28d0c90e92c46bab73a6c WatchSource:0}: Error finding container a85effae4a6d3d9422c3448151c03c94dae7127067b28d0c90e92c46bab73a6c: Status 404 returned error can't find the container with id 
a85effae4a6d3d9422c3448151c03c94dae7127067b28d0c90e92c46bab73a6c Nov 26 22:52:48 crc kubenswrapper[5008]: W1126 22:52:48.362198 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43262a3b_7a09_4b65_9df6_2bfe519feff5.slice/crio-472744725c4cddf6ccc2d594cd614553d1e320a156203b006b00f5334a824b5c WatchSource:0}: Error finding container 472744725c4cddf6ccc2d594cd614553d1e320a156203b006b00f5334a824b5c: Status 404 returned error can't find the container with id 472744725c4cddf6ccc2d594cd614553d1e320a156203b006b00f5334a824b5c Nov 26 22:52:48 crc kubenswrapper[5008]: I1126 22:52:48.679217 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"0dd65e2a-85a1-40be-826a-d521a2e94607","Type":"ContainerStarted","Data":"a85effae4a6d3d9422c3448151c03c94dae7127067b28d0c90e92c46bab73a6c"} Nov 26 22:52:48 crc kubenswrapper[5008]: I1126 22:52:48.680665 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"43262a3b-7a09-4b65-9df6-2bfe519feff5","Type":"ContainerStarted","Data":"472744725c4cddf6ccc2d594cd614553d1e320a156203b006b00f5334a824b5c"} Nov 26 22:52:48 crc kubenswrapper[5008]: I1126 22:52:48.681740 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"15645128-7001-44ca-b06c-25183ae8f5be","Type":"ContainerStarted","Data":"92b19baaec962c311f33714d6a422c10cee862bf91877e0a580ec7326cbea076"} Nov 26 22:52:48 crc kubenswrapper[5008]: I1126 22:52:48.683459 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" event={"ID":"9309fbab-00dc-4e76-a384-b9297f098fe9","Type":"ContainerStarted","Data":"95414fd445c8c86bf6178f654403c95288fdeae312cf85e5f646e058d2bda439"} Nov 26 22:52:50 crc kubenswrapper[5008]: I1126 22:52:50.696700 5008 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" event={"ID":"9309fbab-00dc-4e76-a384-b9297f098fe9","Type":"ContainerStarted","Data":"35576893d513cf3dd2135514bfd4cd05a3e2bd8157ec5e3978fe51affc6d23cf"} Nov 26 22:52:56 crc kubenswrapper[5008]: I1126 22:52:56.743486 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"43262a3b-7a09-4b65-9df6-2bfe519feff5","Type":"ContainerStarted","Data":"58c20423443ef7ea5fe9fe7b2794f1f6ac9557ec5396d8c7dd7e80e016e3fb18"} Nov 26 22:52:57 crc kubenswrapper[5008]: I1126 22:52:57.751078 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"15645128-7001-44ca-b06c-25183ae8f5be","Type":"ContainerStarted","Data":"2f3bf4700b0ea9fad7cafcf0b451453aa9a3f5aa8c6096ce1c9f5f1fec21007a"} Nov 26 22:52:58 crc kubenswrapper[5008]: I1126 22:52:58.761144 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" event={"ID":"9309fbab-00dc-4e76-a384-b9297f098fe9","Type":"ContainerStarted","Data":"4c245ed7cea0bd46c84fe3b57edfa1ffaddf1361f4759619d91f9b328b77d3f1"} Nov 26 22:52:58 crc kubenswrapper[5008]: I1126 22:52:58.762435 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" Nov 26 22:52:58 crc kubenswrapper[5008]: I1126 22:52:58.766709 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"0dd65e2a-85a1-40be-826a-d521a2e94607","Type":"ContainerStarted","Data":"2141655bac67028c7f9b7c54059a5f5565c8e88d467250abfe2ce3ae72bbe9f1"} Nov 26 22:52:58 crc kubenswrapper[5008]: I1126 22:52:58.771700 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" Nov 26 22:52:58 crc kubenswrapper[5008]: 
I1126 22:52:58.786797 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" podStartSLOduration=2.227800972 podStartE2EDuration="12.78678197s" podCreationTimestamp="2025-11-26 22:52:46 +0000 UTC" firstStartedPulling="2025-11-26 22:52:47.733632648 +0000 UTC m=+843.146326650" lastFinishedPulling="2025-11-26 22:52:58.292613636 +0000 UTC m=+853.705307648" observedRunningTime="2025-11-26 22:52:58.783834598 +0000 UTC m=+854.196528610" watchObservedRunningTime="2025-11-26 22:52:58.78678197 +0000 UTC m=+854.199475982" Nov 26 22:53:00 crc kubenswrapper[5008]: I1126 22:53:00.782076 5008 generic.go:334] "Generic (PLEG): container finished" podID="43262a3b-7a09-4b65-9df6-2bfe519feff5" containerID="58c20423443ef7ea5fe9fe7b2794f1f6ac9557ec5396d8c7dd7e80e016e3fb18" exitCode=0 Nov 26 22:53:00 crc kubenswrapper[5008]: I1126 22:53:00.782262 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"43262a3b-7a09-4b65-9df6-2bfe519feff5","Type":"ContainerDied","Data":"58c20423443ef7ea5fe9fe7b2794f1f6ac9557ec5396d8c7dd7e80e016e3fb18"} Nov 26 22:53:01 crc kubenswrapper[5008]: I1126 22:53:01.790375 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"43262a3b-7a09-4b65-9df6-2bfe519feff5","Type":"ContainerStarted","Data":"57b7d6cab3cca22a4d2ad78ece431ceb051a6786918a44bef17f328722b7bb6d"} Nov 26 22:53:01 crc kubenswrapper[5008]: I1126 22:53:01.792844 5008 generic.go:334] "Generic (PLEG): container finished" podID="15645128-7001-44ca-b06c-25183ae8f5be" containerID="2f3bf4700b0ea9fad7cafcf0b451453aa9a3f5aa8c6096ce1c9f5f1fec21007a" exitCode=0 Nov 26 22:53:01 crc kubenswrapper[5008]: I1126 22:53:01.792879 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" 
event={"ID":"15645128-7001-44ca-b06c-25183ae8f5be","Type":"ContainerDied","Data":"2f3bf4700b0ea9fad7cafcf0b451453aa9a3f5aa8c6096ce1c9f5f1fec21007a"} Nov 26 22:53:01 crc kubenswrapper[5008]: I1126 22:53:01.825886 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-2" podStartSLOduration=7.799658591 podStartE2EDuration="15.825867362s" podCreationTimestamp="2025-11-26 22:52:46 +0000 UTC" firstStartedPulling="2025-11-26 22:52:48.363678371 +0000 UTC m=+843.776372373" lastFinishedPulling="2025-11-26 22:52:56.389887142 +0000 UTC m=+851.802581144" observedRunningTime="2025-11-26 22:53:01.819534043 +0000 UTC m=+857.232228045" watchObservedRunningTime="2025-11-26 22:53:01.825867362 +0000 UTC m=+857.238561364" Nov 26 22:53:01 crc kubenswrapper[5008]: I1126 22:53:01.950152 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qg52v"] Nov 26 22:53:01 crc kubenswrapper[5008]: I1126 22:53:01.951482 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qg52v" Nov 26 22:53:01 crc kubenswrapper[5008]: I1126 22:53:01.962676 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qg52v"] Nov 26 22:53:02 crc kubenswrapper[5008]: I1126 22:53:02.127579 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f56947c6-1a64-4808-b3cb-7a5b13c45152-catalog-content\") pod \"redhat-operators-qg52v\" (UID: \"f56947c6-1a64-4808-b3cb-7a5b13c45152\") " pod="openshift-marketplace/redhat-operators-qg52v" Nov 26 22:53:02 crc kubenswrapper[5008]: I1126 22:53:02.127625 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f56947c6-1a64-4808-b3cb-7a5b13c45152-utilities\") pod \"redhat-operators-qg52v\" (UID: \"f56947c6-1a64-4808-b3cb-7a5b13c45152\") " pod="openshift-marketplace/redhat-operators-qg52v" Nov 26 22:53:02 crc kubenswrapper[5008]: I1126 22:53:02.127666 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5fnq\" (UniqueName: \"kubernetes.io/projected/f56947c6-1a64-4808-b3cb-7a5b13c45152-kube-api-access-w5fnq\") pod \"redhat-operators-qg52v\" (UID: \"f56947c6-1a64-4808-b3cb-7a5b13c45152\") " pod="openshift-marketplace/redhat-operators-qg52v" Nov 26 22:53:02 crc kubenswrapper[5008]: I1126 22:53:02.228836 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f56947c6-1a64-4808-b3cb-7a5b13c45152-catalog-content\") pod \"redhat-operators-qg52v\" (UID: \"f56947c6-1a64-4808-b3cb-7a5b13c45152\") " pod="openshift-marketplace/redhat-operators-qg52v" Nov 26 22:53:02 crc kubenswrapper[5008]: I1126 22:53:02.228882 5008 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f56947c6-1a64-4808-b3cb-7a5b13c45152-utilities\") pod \"redhat-operators-qg52v\" (UID: \"f56947c6-1a64-4808-b3cb-7a5b13c45152\") " pod="openshift-marketplace/redhat-operators-qg52v" Nov 26 22:53:02 crc kubenswrapper[5008]: I1126 22:53:02.228906 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5fnq\" (UniqueName: \"kubernetes.io/projected/f56947c6-1a64-4808-b3cb-7a5b13c45152-kube-api-access-w5fnq\") pod \"redhat-operators-qg52v\" (UID: \"f56947c6-1a64-4808-b3cb-7a5b13c45152\") " pod="openshift-marketplace/redhat-operators-qg52v" Nov 26 22:53:02 crc kubenswrapper[5008]: I1126 22:53:02.229316 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f56947c6-1a64-4808-b3cb-7a5b13c45152-utilities\") pod \"redhat-operators-qg52v\" (UID: \"f56947c6-1a64-4808-b3cb-7a5b13c45152\") " pod="openshift-marketplace/redhat-operators-qg52v" Nov 26 22:53:02 crc kubenswrapper[5008]: I1126 22:53:02.229549 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f56947c6-1a64-4808-b3cb-7a5b13c45152-catalog-content\") pod \"redhat-operators-qg52v\" (UID: \"f56947c6-1a64-4808-b3cb-7a5b13c45152\") " pod="openshift-marketplace/redhat-operators-qg52v" Nov 26 22:53:02 crc kubenswrapper[5008]: I1126 22:53:02.244674 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5fnq\" (UniqueName: \"kubernetes.io/projected/f56947c6-1a64-4808-b3cb-7a5b13c45152-kube-api-access-w5fnq\") pod \"redhat-operators-qg52v\" (UID: \"f56947c6-1a64-4808-b3cb-7a5b13c45152\") " pod="openshift-marketplace/redhat-operators-qg52v" Nov 26 22:53:02 crc kubenswrapper[5008]: I1126 22:53:02.281348 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qg52v" Nov 26 22:53:02 crc kubenswrapper[5008]: I1126 22:53:02.555786 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qg52v"] Nov 26 22:53:02 crc kubenswrapper[5008]: I1126 22:53:02.799126 5008 generic.go:334] "Generic (PLEG): container finished" podID="0dd65e2a-85a1-40be-826a-d521a2e94607" containerID="2141655bac67028c7f9b7c54059a5f5565c8e88d467250abfe2ce3ae72bbe9f1" exitCode=0 Nov 26 22:53:02 crc kubenswrapper[5008]: I1126 22:53:02.799196 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"0dd65e2a-85a1-40be-826a-d521a2e94607","Type":"ContainerDied","Data":"2141655bac67028c7f9b7c54059a5f5565c8e88d467250abfe2ce3ae72bbe9f1"} Nov 26 22:53:02 crc kubenswrapper[5008]: I1126 22:53:02.801180 5008 generic.go:334] "Generic (PLEG): container finished" podID="f56947c6-1a64-4808-b3cb-7a5b13c45152" containerID="f14d0ebf977d4d8ec71df2e9008f1f4ad268e2e273662920cf3ab0324644fef2" exitCode=0 Nov 26 22:53:02 crc kubenswrapper[5008]: I1126 22:53:02.801246 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qg52v" event={"ID":"f56947c6-1a64-4808-b3cb-7a5b13c45152","Type":"ContainerDied","Data":"f14d0ebf977d4d8ec71df2e9008f1f4ad268e2e273662920cf3ab0324644fef2"} Nov 26 22:53:02 crc kubenswrapper[5008]: I1126 22:53:02.801263 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qg52v" event={"ID":"f56947c6-1a64-4808-b3cb-7a5b13c45152","Type":"ContainerStarted","Data":"9bbf4d0aa6e72f8c8a7933cc923b8900a6f5fa8fdfc963017ea9fa828108d755"} Nov 26 22:53:02 crc kubenswrapper[5008]: I1126 22:53:02.803626 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" 
event={"ID":"15645128-7001-44ca-b06c-25183ae8f5be","Type":"ContainerStarted","Data":"40e85e5c3b6bc4a6079f3c4397f75e99d2f8b0c234dda10868bdf01007a83ba6"} Nov 26 22:53:02 crc kubenswrapper[5008]: I1126 22:53:02.841055 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-0" podStartSLOduration=8.740357552 podStartE2EDuration="16.841039003s" podCreationTimestamp="2025-11-26 22:52:46 +0000 UTC" firstStartedPulling="2025-11-26 22:52:48.287616641 +0000 UTC m=+843.700310643" lastFinishedPulling="2025-11-26 22:52:56.388298102 +0000 UTC m=+851.800992094" observedRunningTime="2025-11-26 22:53:02.83743804 +0000 UTC m=+858.250132052" watchObservedRunningTime="2025-11-26 22:53:02.841039003 +0000 UTC m=+858.253733005" Nov 26 22:53:03 crc kubenswrapper[5008]: I1126 22:53:03.602803 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/memcached-0"] Nov 26 22:53:03 crc kubenswrapper[5008]: I1126 22:53:03.603702 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/memcached-0" Nov 26 22:53:03 crc kubenswrapper[5008]: I1126 22:53:03.605250 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"memcached-memcached-dockercfg-gw8b6" Nov 26 22:53:03 crc kubenswrapper[5008]: I1126 22:53:03.605871 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"memcached-config-data" Nov 26 22:53:03 crc kubenswrapper[5008]: I1126 22:53:03.629472 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Nov 26 22:53:03 crc kubenswrapper[5008]: I1126 22:53:03.757649 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpgqp\" (UniqueName: \"kubernetes.io/projected/7a5c5e3b-4219-4474-9626-90cd49264d18-kube-api-access-bpgqp\") pod \"memcached-0\" (UID: \"7a5c5e3b-4219-4474-9626-90cd49264d18\") " pod="glance-kuttl-tests/memcached-0" Nov 26 22:53:03 crc kubenswrapper[5008]: I1126 22:53:03.758015 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a5c5e3b-4219-4474-9626-90cd49264d18-config-data\") pod \"memcached-0\" (UID: \"7a5c5e3b-4219-4474-9626-90cd49264d18\") " pod="glance-kuttl-tests/memcached-0" Nov 26 22:53:03 crc kubenswrapper[5008]: I1126 22:53:03.758041 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7a5c5e3b-4219-4474-9626-90cd49264d18-kolla-config\") pod \"memcached-0\" (UID: \"7a5c5e3b-4219-4474-9626-90cd49264d18\") " pod="glance-kuttl-tests/memcached-0" Nov 26 22:53:03 crc kubenswrapper[5008]: I1126 22:53:03.813252 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" 
event={"ID":"0dd65e2a-85a1-40be-826a-d521a2e94607","Type":"ContainerStarted","Data":"c5c2de128c7f2f41bb11d903f2e678f05545b27abef81955e5f0a9da2f376b7a"} Nov 26 22:53:03 crc kubenswrapper[5008]: I1126 22:53:03.817846 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qg52v" event={"ID":"f56947c6-1a64-4808-b3cb-7a5b13c45152","Type":"ContainerStarted","Data":"c47aaefaa7139fd192e3c3cbc18e153c8c8c9efeec7701bde1d39468f2b412f0"} Nov 26 22:53:03 crc kubenswrapper[5008]: I1126 22:53:03.859255 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpgqp\" (UniqueName: \"kubernetes.io/projected/7a5c5e3b-4219-4474-9626-90cd49264d18-kube-api-access-bpgqp\") pod \"memcached-0\" (UID: \"7a5c5e3b-4219-4474-9626-90cd49264d18\") " pod="glance-kuttl-tests/memcached-0" Nov 26 22:53:03 crc kubenswrapper[5008]: I1126 22:53:03.859333 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a5c5e3b-4219-4474-9626-90cd49264d18-config-data\") pod \"memcached-0\" (UID: \"7a5c5e3b-4219-4474-9626-90cd49264d18\") " pod="glance-kuttl-tests/memcached-0" Nov 26 22:53:03 crc kubenswrapper[5008]: I1126 22:53:03.859354 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7a5c5e3b-4219-4474-9626-90cd49264d18-kolla-config\") pod \"memcached-0\" (UID: \"7a5c5e3b-4219-4474-9626-90cd49264d18\") " pod="glance-kuttl-tests/memcached-0" Nov 26 22:53:03 crc kubenswrapper[5008]: I1126 22:53:03.860286 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7a5c5e3b-4219-4474-9626-90cd49264d18-kolla-config\") pod \"memcached-0\" (UID: \"7a5c5e3b-4219-4474-9626-90cd49264d18\") " pod="glance-kuttl-tests/memcached-0" Nov 26 22:53:03 crc kubenswrapper[5008]: I1126 22:53:03.860344 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a5c5e3b-4219-4474-9626-90cd49264d18-config-data\") pod \"memcached-0\" (UID: \"7a5c5e3b-4219-4474-9626-90cd49264d18\") " pod="glance-kuttl-tests/memcached-0" Nov 26 22:53:03 crc kubenswrapper[5008]: I1126 22:53:03.867726 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-1" podStartSLOduration=9.648701538 podStartE2EDuration="17.867711756s" podCreationTimestamp="2025-11-26 22:52:46 +0000 UTC" firstStartedPulling="2025-11-26 22:52:48.360885863 +0000 UTC m=+843.773579855" lastFinishedPulling="2025-11-26 22:52:56.579896071 +0000 UTC m=+851.992590073" observedRunningTime="2025-11-26 22:53:03.843363081 +0000 UTC m=+859.256057093" watchObservedRunningTime="2025-11-26 22:53:03.867711756 +0000 UTC m=+859.280405758" Nov 26 22:53:03 crc kubenswrapper[5008]: I1126 22:53:03.886817 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpgqp\" (UniqueName: \"kubernetes.io/projected/7a5c5e3b-4219-4474-9626-90cd49264d18-kube-api-access-bpgqp\") pod \"memcached-0\" (UID: \"7a5c5e3b-4219-4474-9626-90cd49264d18\") " pod="glance-kuttl-tests/memcached-0" Nov 26 22:53:03 crc kubenswrapper[5008]: I1126 22:53:03.933986 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/memcached-0" Nov 26 22:53:04 crc kubenswrapper[5008]: I1126 22:53:04.392769 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Nov 26 22:53:04 crc kubenswrapper[5008]: W1126 22:53:04.435771 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a5c5e3b_4219_4474_9626_90cd49264d18.slice/crio-68eb527feb75ac6fcbf0c8ec4b4949e9b0e1217d47edd61b25673e80787c1fe2 WatchSource:0}: Error finding container 68eb527feb75ac6fcbf0c8ec4b4949e9b0e1217d47edd61b25673e80787c1fe2: Status 404 returned error can't find the container with id 68eb527feb75ac6fcbf0c8ec4b4949e9b0e1217d47edd61b25673e80787c1fe2 Nov 26 22:53:04 crc kubenswrapper[5008]: I1126 22:53:04.825615 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"7a5c5e3b-4219-4474-9626-90cd49264d18","Type":"ContainerStarted","Data":"68eb527feb75ac6fcbf0c8ec4b4949e9b0e1217d47edd61b25673e80787c1fe2"} Nov 26 22:53:04 crc kubenswrapper[5008]: I1126 22:53:04.827525 5008 generic.go:334] "Generic (PLEG): container finished" podID="f56947c6-1a64-4808-b3cb-7a5b13c45152" containerID="c47aaefaa7139fd192e3c3cbc18e153c8c8c9efeec7701bde1d39468f2b412f0" exitCode=0 Nov 26 22:53:04 crc kubenswrapper[5008]: I1126 22:53:04.827553 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qg52v" event={"ID":"f56947c6-1a64-4808-b3cb-7a5b13c45152","Type":"ContainerDied","Data":"c47aaefaa7139fd192e3c3cbc18e153c8c8c9efeec7701bde1d39468f2b412f0"} Nov 26 22:53:05 crc kubenswrapper[5008]: I1126 22:53:05.845788 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qg52v" event={"ID":"f56947c6-1a64-4808-b3cb-7a5b13c45152","Type":"ContainerStarted","Data":"fb18c16f86448725fb6a75a9ad71e6bc27e4c4326529a22c397721ecd6a58594"} Nov 26 22:53:05 crc 
kubenswrapper[5008]: I1126 22:53:05.862552 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qg52v" podStartSLOduration=2.357108956 podStartE2EDuration="4.862533903s" podCreationTimestamp="2025-11-26 22:53:01 +0000 UTC" firstStartedPulling="2025-11-26 22:53:02.802807182 +0000 UTC m=+858.215501184" lastFinishedPulling="2025-11-26 22:53:05.308232129 +0000 UTC m=+860.720926131" observedRunningTime="2025-11-26 22:53:05.859023682 +0000 UTC m=+861.271717694" watchObservedRunningTime="2025-11-26 22:53:05.862533903 +0000 UTC m=+861.275227915" Nov 26 22:53:07 crc kubenswrapper[5008]: I1126 22:53:07.335333 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-9cglp"] Nov 26 22:53:07 crc kubenswrapper[5008]: I1126 22:53:07.337157 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-9cglp" Nov 26 22:53:07 crc kubenswrapper[5008]: I1126 22:53:07.340158 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-mmw8w" Nov 26 22:53:07 crc kubenswrapper[5008]: I1126 22:53:07.343461 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-9cglp"] Nov 26 22:53:07 crc kubenswrapper[5008]: I1126 22:53:07.506837 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fphpt\" (UniqueName: \"kubernetes.io/projected/26913e68-13a0-4ccd-8d66-19922a16e068-kube-api-access-fphpt\") pod \"rabbitmq-cluster-operator-index-9cglp\" (UID: \"26913e68-13a0-4ccd-8d66-19922a16e068\") " pod="openstack-operators/rabbitmq-cluster-operator-index-9cglp" Nov 26 22:53:07 crc kubenswrapper[5008]: I1126 22:53:07.608221 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fphpt\" 
(UniqueName: \"kubernetes.io/projected/26913e68-13a0-4ccd-8d66-19922a16e068-kube-api-access-fphpt\") pod \"rabbitmq-cluster-operator-index-9cglp\" (UID: \"26913e68-13a0-4ccd-8d66-19922a16e068\") " pod="openstack-operators/rabbitmq-cluster-operator-index-9cglp" Nov 26 22:53:07 crc kubenswrapper[5008]: I1126 22:53:07.648995 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fphpt\" (UniqueName: \"kubernetes.io/projected/26913e68-13a0-4ccd-8d66-19922a16e068-kube-api-access-fphpt\") pod \"rabbitmq-cluster-operator-index-9cglp\" (UID: \"26913e68-13a0-4ccd-8d66-19922a16e068\") " pod="openstack-operators/rabbitmq-cluster-operator-index-9cglp" Nov 26 22:53:07 crc kubenswrapper[5008]: I1126 22:53:07.928702 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:53:07 crc kubenswrapper[5008]: I1126 22:53:07.929688 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:53:07 crc kubenswrapper[5008]: I1126 22:53:07.929903 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:53:07 crc kubenswrapper[5008]: I1126 22:53:07.930498 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:53:07 crc kubenswrapper[5008]: I1126 22:53:07.930609 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:53:07 crc kubenswrapper[5008]: I1126 22:53:07.930756 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-9cglp" Nov 26 22:53:07 crc kubenswrapper[5008]: I1126 22:53:07.931869 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:53:07 crc kubenswrapper[5008]: I1126 22:53:07.949617 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"7a5c5e3b-4219-4474-9626-90cd49264d18","Type":"ContainerStarted","Data":"847369ca4f81a50d949dd18521dc27c76b50c6c5897e0e34b952640d66afd402"} Nov 26 22:53:07 crc kubenswrapper[5008]: I1126 22:53:07.970437 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/memcached-0" podStartSLOduration=2.620653953 podStartE2EDuration="4.970419221s" podCreationTimestamp="2025-11-26 22:53:03 +0000 UTC" firstStartedPulling="2025-11-26 22:53:04.438095714 +0000 UTC m=+859.850789716" lastFinishedPulling="2025-11-26 22:53:06.787860982 +0000 UTC m=+862.200554984" observedRunningTime="2025-11-26 22:53:07.968283754 +0000 UTC m=+863.380977756" watchObservedRunningTime="2025-11-26 22:53:07.970419221 +0000 UTC m=+863.383113223" Nov 26 22:53:08 crc kubenswrapper[5008]: I1126 22:53:08.386724 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-9cglp"] Nov 26 22:53:08 crc kubenswrapper[5008]: W1126 22:53:08.394751 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26913e68_13a0_4ccd_8d66_19922a16e068.slice/crio-e526df3e3bee182aa3290d11fe4433a39223eccd1f7d2cf175e5a07624723df4 WatchSource:0}: Error finding container e526df3e3bee182aa3290d11fe4433a39223eccd1f7d2cf175e5a07624723df4: Status 404 returned error can't find the container with id e526df3e3bee182aa3290d11fe4433a39223eccd1f7d2cf175e5a07624723df4 Nov 26 22:53:08 crc kubenswrapper[5008]: I1126 22:53:08.934105 5008 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/memcached-0" Nov 26 22:53:08 crc kubenswrapper[5008]: I1126 22:53:08.958753 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-9cglp" event={"ID":"26913e68-13a0-4ccd-8d66-19922a16e068","Type":"ContainerStarted","Data":"e526df3e3bee182aa3290d11fe4433a39223eccd1f7d2cf175e5a07624723df4"} Nov 26 22:53:10 crc kubenswrapper[5008]: E1126 22:53:10.615461 5008 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.166:59714->38.102.83.166:42151: write tcp 38.102.83.166:59714->38.102.83.166:42151: write: broken pipe Nov 26 22:53:12 crc kubenswrapper[5008]: I1126 22:53:12.282390 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qg52v" Nov 26 22:53:12 crc kubenswrapper[5008]: I1126 22:53:12.282862 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qg52v" Nov 26 22:53:12 crc kubenswrapper[5008]: I1126 22:53:12.355376 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qg52v" Nov 26 22:53:12 crc kubenswrapper[5008]: I1126 22:53:12.361808 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:53:12 crc kubenswrapper[5008]: I1126 22:53:12.473896 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-2" Nov 26 22:53:12 crc kubenswrapper[5008]: E1126 22:53:12.659176 5008 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.166:59742->38.102.83.166:42151: write tcp 38.102.83.166:59742->38.102.83.166:42151: write: connection reset by peer Nov 26 22:53:12 crc kubenswrapper[5008]: E1126 22:53:12.752666 5008 upgradeaware.go:427] Error proxying data from client to 
backend: readfrom tcp 38.102.83.166:59760->38.102.83.166:42151: write tcp 38.102.83.166:59760->38.102.83.166:42151: write: connection reset by peer Nov 26 22:53:12 crc kubenswrapper[5008]: E1126 22:53:12.834343 5008 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.166:59766->38.102.83.166:42151: write tcp 38.102.83.166:59766->38.102.83.166:42151: write: connection reset by peer Nov 26 22:53:12 crc kubenswrapper[5008]: I1126 22:53:12.995684 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-9cglp" event={"ID":"26913e68-13a0-4ccd-8d66-19922a16e068","Type":"ContainerStarted","Data":"7038f2db4beeb8c45365901063a0423e4c997a12f68ba92fac606be98e910c14"} Nov 26 22:53:13 crc kubenswrapper[5008]: I1126 22:53:13.022772 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-9cglp" podStartSLOduration=2.375202193 podStartE2EDuration="6.02275081s" podCreationTimestamp="2025-11-26 22:53:07 +0000 UTC" firstStartedPulling="2025-11-26 22:53:08.397216999 +0000 UTC m=+863.809911001" lastFinishedPulling="2025-11-26 22:53:12.044765616 +0000 UTC m=+867.457459618" observedRunningTime="2025-11-26 22:53:13.01959714 +0000 UTC m=+868.432291182" watchObservedRunningTime="2025-11-26 22:53:13.02275081 +0000 UTC m=+868.435444822" Nov 26 22:53:13 crc kubenswrapper[5008]: I1126 22:53:13.044556 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qg52v" Nov 26 22:53:13 crc kubenswrapper[5008]: I1126 22:53:13.935068 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/memcached-0" Nov 26 22:53:14 crc kubenswrapper[5008]: I1126 22:53:14.122744 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gchxt"] Nov 26 22:53:14 crc kubenswrapper[5008]: I1126 22:53:14.126017 5008 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gchxt" Nov 26 22:53:14 crc kubenswrapper[5008]: I1126 22:53:14.135311 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gchxt"] Nov 26 22:53:14 crc kubenswrapper[5008]: I1126 22:53:14.210374 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkm6x\" (UniqueName: \"kubernetes.io/projected/94c69ad0-d741-4198-978f-b688fa003f2e-kube-api-access-gkm6x\") pod \"community-operators-gchxt\" (UID: \"94c69ad0-d741-4198-978f-b688fa003f2e\") " pod="openshift-marketplace/community-operators-gchxt" Nov 26 22:53:14 crc kubenswrapper[5008]: I1126 22:53:14.210447 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94c69ad0-d741-4198-978f-b688fa003f2e-catalog-content\") pod \"community-operators-gchxt\" (UID: \"94c69ad0-d741-4198-978f-b688fa003f2e\") " pod="openshift-marketplace/community-operators-gchxt" Nov 26 22:53:14 crc kubenswrapper[5008]: I1126 22:53:14.210595 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94c69ad0-d741-4198-978f-b688fa003f2e-utilities\") pod \"community-operators-gchxt\" (UID: \"94c69ad0-d741-4198-978f-b688fa003f2e\") " pod="openshift-marketplace/community-operators-gchxt" Nov 26 22:53:14 crc kubenswrapper[5008]: I1126 22:53:14.311720 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkm6x\" (UniqueName: \"kubernetes.io/projected/94c69ad0-d741-4198-978f-b688fa003f2e-kube-api-access-gkm6x\") pod \"community-operators-gchxt\" (UID: \"94c69ad0-d741-4198-978f-b688fa003f2e\") " pod="openshift-marketplace/community-operators-gchxt" Nov 26 22:53:14 crc kubenswrapper[5008]: 
I1126 22:53:14.311797 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94c69ad0-d741-4198-978f-b688fa003f2e-catalog-content\") pod \"community-operators-gchxt\" (UID: \"94c69ad0-d741-4198-978f-b688fa003f2e\") " pod="openshift-marketplace/community-operators-gchxt" Nov 26 22:53:14 crc kubenswrapper[5008]: I1126 22:53:14.311860 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94c69ad0-d741-4198-978f-b688fa003f2e-utilities\") pod \"community-operators-gchxt\" (UID: \"94c69ad0-d741-4198-978f-b688fa003f2e\") " pod="openshift-marketplace/community-operators-gchxt" Nov 26 22:53:14 crc kubenswrapper[5008]: I1126 22:53:14.312479 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94c69ad0-d741-4198-978f-b688fa003f2e-utilities\") pod \"community-operators-gchxt\" (UID: \"94c69ad0-d741-4198-978f-b688fa003f2e\") " pod="openshift-marketplace/community-operators-gchxt" Nov 26 22:53:14 crc kubenswrapper[5008]: I1126 22:53:14.312653 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94c69ad0-d741-4198-978f-b688fa003f2e-catalog-content\") pod \"community-operators-gchxt\" (UID: \"94c69ad0-d741-4198-978f-b688fa003f2e\") " pod="openshift-marketplace/community-operators-gchxt" Nov 26 22:53:14 crc kubenswrapper[5008]: I1126 22:53:14.338352 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkm6x\" (UniqueName: \"kubernetes.io/projected/94c69ad0-d741-4198-978f-b688fa003f2e-kube-api-access-gkm6x\") pod \"community-operators-gchxt\" (UID: \"94c69ad0-d741-4198-978f-b688fa003f2e\") " pod="openshift-marketplace/community-operators-gchxt" Nov 26 22:53:14 crc kubenswrapper[5008]: I1126 22:53:14.450234 5008 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gchxt" Nov 26 22:53:14 crc kubenswrapper[5008]: I1126 22:53:14.516398 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qg52v"] Nov 26 22:53:14 crc kubenswrapper[5008]: I1126 22:53:14.893669 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gchxt"] Nov 26 22:53:15 crc kubenswrapper[5008]: I1126 22:53:15.018782 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gchxt" event={"ID":"94c69ad0-d741-4198-978f-b688fa003f2e","Type":"ContainerStarted","Data":"ff3a173a73613e98b233a43e67bfd2ebe5276f2cd6711608befee9508ff300df"} Nov 26 22:53:15 crc kubenswrapper[5008]: I1126 22:53:15.018972 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qg52v" podUID="f56947c6-1a64-4808-b3cb-7a5b13c45152" containerName="registry-server" containerID="cri-o://fb18c16f86448725fb6a75a9ad71e6bc27e4c4326529a22c397721ecd6a58594" gracePeriod=2 Nov 26 22:53:16 crc kubenswrapper[5008]: I1126 22:53:16.027353 5008 generic.go:334] "Generic (PLEG): container finished" podID="f56947c6-1a64-4808-b3cb-7a5b13c45152" containerID="fb18c16f86448725fb6a75a9ad71e6bc27e4c4326529a22c397721ecd6a58594" exitCode=0 Nov 26 22:53:16 crc kubenswrapper[5008]: I1126 22:53:16.027594 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qg52v" event={"ID":"f56947c6-1a64-4808-b3cb-7a5b13c45152","Type":"ContainerDied","Data":"fb18c16f86448725fb6a75a9ad71e6bc27e4c4326529a22c397721ecd6a58594"} Nov 26 22:53:16 crc kubenswrapper[5008]: I1126 22:53:16.030397 5008 generic.go:334] "Generic (PLEG): container finished" podID="94c69ad0-d741-4198-978f-b688fa003f2e" containerID="bea282cf8dcd4c0055290fb025b063994976e7adbc4d0ee172bf23bf5876f2f7" exitCode=0 Nov 26 22:53:16 crc 
kubenswrapper[5008]: I1126 22:53:16.030418 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gchxt" event={"ID":"94c69ad0-d741-4198-978f-b688fa003f2e","Type":"ContainerDied","Data":"bea282cf8dcd4c0055290fb025b063994976e7adbc4d0ee172bf23bf5876f2f7"} Nov 26 22:53:16 crc kubenswrapper[5008]: I1126 22:53:16.125438 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qg52v" Nov 26 22:53:16 crc kubenswrapper[5008]: I1126 22:53:16.234306 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f56947c6-1a64-4808-b3cb-7a5b13c45152-catalog-content\") pod \"f56947c6-1a64-4808-b3cb-7a5b13c45152\" (UID: \"f56947c6-1a64-4808-b3cb-7a5b13c45152\") " Nov 26 22:53:16 crc kubenswrapper[5008]: I1126 22:53:16.234440 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f56947c6-1a64-4808-b3cb-7a5b13c45152-utilities\") pod \"f56947c6-1a64-4808-b3cb-7a5b13c45152\" (UID: \"f56947c6-1a64-4808-b3cb-7a5b13c45152\") " Nov 26 22:53:16 crc kubenswrapper[5008]: I1126 22:53:16.234487 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5fnq\" (UniqueName: \"kubernetes.io/projected/f56947c6-1a64-4808-b3cb-7a5b13c45152-kube-api-access-w5fnq\") pod \"f56947c6-1a64-4808-b3cb-7a5b13c45152\" (UID: \"f56947c6-1a64-4808-b3cb-7a5b13c45152\") " Nov 26 22:53:16 crc kubenswrapper[5008]: I1126 22:53:16.235358 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f56947c6-1a64-4808-b3cb-7a5b13c45152-utilities" (OuterVolumeSpecName: "utilities") pod "f56947c6-1a64-4808-b3cb-7a5b13c45152" (UID: "f56947c6-1a64-4808-b3cb-7a5b13c45152"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:53:16 crc kubenswrapper[5008]: I1126 22:53:16.244253 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f56947c6-1a64-4808-b3cb-7a5b13c45152-kube-api-access-w5fnq" (OuterVolumeSpecName: "kube-api-access-w5fnq") pod "f56947c6-1a64-4808-b3cb-7a5b13c45152" (UID: "f56947c6-1a64-4808-b3cb-7a5b13c45152"). InnerVolumeSpecName "kube-api-access-w5fnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:53:16 crc kubenswrapper[5008]: I1126 22:53:16.318467 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f56947c6-1a64-4808-b3cb-7a5b13c45152-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f56947c6-1a64-4808-b3cb-7a5b13c45152" (UID: "f56947c6-1a64-4808-b3cb-7a5b13c45152"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:53:16 crc kubenswrapper[5008]: I1126 22:53:16.335607 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5fnq\" (UniqueName: \"kubernetes.io/projected/f56947c6-1a64-4808-b3cb-7a5b13c45152-kube-api-access-w5fnq\") on node \"crc\" DevicePath \"\"" Nov 26 22:53:16 crc kubenswrapper[5008]: I1126 22:53:16.335799 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f56947c6-1a64-4808-b3cb-7a5b13c45152-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 22:53:16 crc kubenswrapper[5008]: I1126 22:53:16.335867 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f56947c6-1a64-4808-b3cb-7a5b13c45152-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 22:53:17 crc kubenswrapper[5008]: I1126 22:53:17.036760 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qg52v" 
event={"ID":"f56947c6-1a64-4808-b3cb-7a5b13c45152","Type":"ContainerDied","Data":"9bbf4d0aa6e72f8c8a7933cc923b8900a6f5fa8fdfc963017ea9fa828108d755"} Nov 26 22:53:17 crc kubenswrapper[5008]: I1126 22:53:17.037056 5008 scope.go:117] "RemoveContainer" containerID="fb18c16f86448725fb6a75a9ad71e6bc27e4c4326529a22c397721ecd6a58594" Nov 26 22:53:17 crc kubenswrapper[5008]: I1126 22:53:17.036811 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qg52v" Nov 26 22:53:17 crc kubenswrapper[5008]: I1126 22:53:17.079952 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qg52v"] Nov 26 22:53:17 crc kubenswrapper[5008]: I1126 22:53:17.086792 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qg52v"] Nov 26 22:53:17 crc kubenswrapper[5008]: I1126 22:53:17.527541 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f56947c6-1a64-4808-b3cb-7a5b13c45152" path="/var/lib/kubelet/pods/f56947c6-1a64-4808-b3cb-7a5b13c45152/volumes" Nov 26 22:53:17 crc kubenswrapper[5008]: I1126 22:53:17.931538 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-9cglp" Nov 26 22:53:17 crc kubenswrapper[5008]: I1126 22:53:17.931632 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-9cglp" Nov 26 22:53:17 crc kubenswrapper[5008]: I1126 22:53:17.961254 5008 scope.go:117] "RemoveContainer" containerID="c47aaefaa7139fd192e3c3cbc18e153c8c8c9efeec7701bde1d39468f2b412f0" Nov 26 22:53:17 crc kubenswrapper[5008]: I1126 22:53:17.972282 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-9cglp" Nov 26 22:53:17 crc kubenswrapper[5008]: I1126 22:53:17.985040 5008 scope.go:117] "RemoveContainer" 
containerID="f14d0ebf977d4d8ec71df2e9008f1f4ad268e2e273662920cf3ab0324644fef2" Nov 26 22:53:18 crc kubenswrapper[5008]: I1126 22:53:18.068707 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-9cglp" Nov 26 22:53:18 crc kubenswrapper[5008]: I1126 22:53:18.075355 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/openstack-galera-2" podUID="43262a3b-7a09-4b65-9df6-2bfe519feff5" containerName="galera" probeResult="failure" output=< Nov 26 22:53:18 crc kubenswrapper[5008]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Nov 26 22:53:18 crc kubenswrapper[5008]: > Nov 26 22:53:20 crc kubenswrapper[5008]: E1126 22:53:20.148682 5008 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.166:45930->38.102.83.166:42151: write tcp 38.102.83.166:45930->38.102.83.166:42151: write: broken pipe Nov 26 22:53:20 crc kubenswrapper[5008]: I1126 22:53:20.371826 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7"] Nov 26 22:53:20 crc kubenswrapper[5008]: E1126 22:53:20.372132 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f56947c6-1a64-4808-b3cb-7a5b13c45152" containerName="extract-utilities" Nov 26 22:53:20 crc kubenswrapper[5008]: I1126 22:53:20.372154 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f56947c6-1a64-4808-b3cb-7a5b13c45152" containerName="extract-utilities" Nov 26 22:53:20 crc kubenswrapper[5008]: E1126 22:53:20.372170 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f56947c6-1a64-4808-b3cb-7a5b13c45152" containerName="extract-content" Nov 26 22:53:20 crc kubenswrapper[5008]: I1126 22:53:20.372178 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f56947c6-1a64-4808-b3cb-7a5b13c45152" containerName="extract-content" Nov 26 22:53:20 crc kubenswrapper[5008]: 
E1126 22:53:20.372190 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f56947c6-1a64-4808-b3cb-7a5b13c45152" containerName="registry-server" Nov 26 22:53:20 crc kubenswrapper[5008]: I1126 22:53:20.372199 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f56947c6-1a64-4808-b3cb-7a5b13c45152" containerName="registry-server" Nov 26 22:53:20 crc kubenswrapper[5008]: I1126 22:53:20.372360 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f56947c6-1a64-4808-b3cb-7a5b13c45152" containerName="registry-server" Nov 26 22:53:20 crc kubenswrapper[5008]: I1126 22:53:20.373343 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7" Nov 26 22:53:20 crc kubenswrapper[5008]: I1126 22:53:20.375413 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6kdt6" Nov 26 22:53:20 crc kubenswrapper[5008]: I1126 22:53:20.382002 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7"] Nov 26 22:53:20 crc kubenswrapper[5008]: I1126 22:53:20.501897 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7687a56-2af0-49c5-a05f-32f3b3b73fb2-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7\" (UID: \"f7687a56-2af0-49c5-a05f-32f3b3b73fb2\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7" Nov 26 22:53:20 crc kubenswrapper[5008]: I1126 22:53:20.501945 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7687a56-2af0-49c5-a05f-32f3b3b73fb2-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7\" (UID: 
\"f7687a56-2af0-49c5-a05f-32f3b3b73fb2\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7" Nov 26 22:53:20 crc kubenswrapper[5008]: I1126 22:53:20.502010 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75s4v\" (UniqueName: \"kubernetes.io/projected/f7687a56-2af0-49c5-a05f-32f3b3b73fb2-kube-api-access-75s4v\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7\" (UID: \"f7687a56-2af0-49c5-a05f-32f3b3b73fb2\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7" Nov 26 22:53:20 crc kubenswrapper[5008]: I1126 22:53:20.603673 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75s4v\" (UniqueName: \"kubernetes.io/projected/f7687a56-2af0-49c5-a05f-32f3b3b73fb2-kube-api-access-75s4v\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7\" (UID: \"f7687a56-2af0-49c5-a05f-32f3b3b73fb2\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7" Nov 26 22:53:20 crc kubenswrapper[5008]: I1126 22:53:20.603727 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7687a56-2af0-49c5-a05f-32f3b3b73fb2-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7\" (UID: \"f7687a56-2af0-49c5-a05f-32f3b3b73fb2\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7" Nov 26 22:53:20 crc kubenswrapper[5008]: I1126 22:53:20.603751 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7687a56-2af0-49c5-a05f-32f3b3b73fb2-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7\" (UID: \"f7687a56-2af0-49c5-a05f-32f3b3b73fb2\") " 
pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7" Nov 26 22:53:20 crc kubenswrapper[5008]: I1126 22:53:20.605284 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7687a56-2af0-49c5-a05f-32f3b3b73fb2-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7\" (UID: \"f7687a56-2af0-49c5-a05f-32f3b3b73fb2\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7" Nov 26 22:53:20 crc kubenswrapper[5008]: I1126 22:53:20.605566 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7687a56-2af0-49c5-a05f-32f3b3b73fb2-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7\" (UID: \"f7687a56-2af0-49c5-a05f-32f3b3b73fb2\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7" Nov 26 22:53:20 crc kubenswrapper[5008]: I1126 22:53:20.623580 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75s4v\" (UniqueName: \"kubernetes.io/projected/f7687a56-2af0-49c5-a05f-32f3b3b73fb2-kube-api-access-75s4v\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7\" (UID: \"f7687a56-2af0-49c5-a05f-32f3b3b73fb2\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7" Nov 26 22:53:20 crc kubenswrapper[5008]: I1126 22:53:20.689511 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7" Nov 26 22:53:22 crc kubenswrapper[5008]: I1126 22:53:22.734176 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7"] Nov 26 22:53:23 crc kubenswrapper[5008]: I1126 22:53:23.077676 5008 generic.go:334] "Generic (PLEG): container finished" podID="94c69ad0-d741-4198-978f-b688fa003f2e" containerID="01faaeb45d6551f68782301eb4dbbcb3fd128c396937ffa09c0a866e14ce2098" exitCode=0 Nov 26 22:53:23 crc kubenswrapper[5008]: I1126 22:53:23.077771 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gchxt" event={"ID":"94c69ad0-d741-4198-978f-b688fa003f2e","Type":"ContainerDied","Data":"01faaeb45d6551f68782301eb4dbbcb3fd128c396937ffa09c0a866e14ce2098"} Nov 26 22:53:23 crc kubenswrapper[5008]: I1126 22:53:23.080198 5008 generic.go:334] "Generic (PLEG): container finished" podID="f7687a56-2af0-49c5-a05f-32f3b3b73fb2" containerID="7be8276d3bda1b8adf52099d8bdeb929f9d0fad65b192b609ae2e4d3dac4f91a" exitCode=0 Nov 26 22:53:23 crc kubenswrapper[5008]: I1126 22:53:23.080254 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7" event={"ID":"f7687a56-2af0-49c5-a05f-32f3b3b73fb2","Type":"ContainerDied","Data":"7be8276d3bda1b8adf52099d8bdeb929f9d0fad65b192b609ae2e4d3dac4f91a"} Nov 26 22:53:23 crc kubenswrapper[5008]: I1126 22:53:23.080305 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7" event={"ID":"f7687a56-2af0-49c5-a05f-32f3b3b73fb2","Type":"ContainerStarted","Data":"cbb25e85630d16b914b3eb80cb3b7e7bd2aea0917d007fdec0a88a9555bf59d7"} Nov 26 22:53:24 crc kubenswrapper[5008]: I1126 22:53:24.093435 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-gchxt" event={"ID":"94c69ad0-d741-4198-978f-b688fa003f2e","Type":"ContainerStarted","Data":"fb63581198d37d4cae90ca2f31e9b38d4ae004e86d13284f52bc6b8f2fbc88fc"} Nov 26 22:53:24 crc kubenswrapper[5008]: I1126 22:53:24.115715 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gchxt" podStartSLOduration=2.508839343 podStartE2EDuration="10.115692681s" podCreationTimestamp="2025-11-26 22:53:14 +0000 UTC" firstStartedPulling="2025-11-26 22:53:16.031951223 +0000 UTC m=+871.444645235" lastFinishedPulling="2025-11-26 22:53:23.638804541 +0000 UTC m=+879.051498573" observedRunningTime="2025-11-26 22:53:24.115577208 +0000 UTC m=+879.528271240" watchObservedRunningTime="2025-11-26 22:53:24.115692681 +0000 UTC m=+879.528386713" Nov 26 22:53:24 crc kubenswrapper[5008]: I1126 22:53:24.448587 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:53:24 crc kubenswrapper[5008]: I1126 22:53:24.450713 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gchxt" Nov 26 22:53:24 crc kubenswrapper[5008]: I1126 22:53:24.450802 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gchxt" Nov 26 22:53:24 crc kubenswrapper[5008]: I1126 22:53:24.560189 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-1" Nov 26 22:53:25 crc kubenswrapper[5008]: I1126 22:53:25.101422 5008 generic.go:334] "Generic (PLEG): container finished" podID="f7687a56-2af0-49c5-a05f-32f3b3b73fb2" containerID="f89d7eb533033a8ff9b613c39a152e1c964d7c78f8211173ba9bd0f1d9565774" exitCode=0 Nov 26 22:53:25 crc kubenswrapper[5008]: I1126 22:53:25.101499 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7" event={"ID":"f7687a56-2af0-49c5-a05f-32f3b3b73fb2","Type":"ContainerDied","Data":"f89d7eb533033a8ff9b613c39a152e1c964d7c78f8211173ba9bd0f1d9565774"} Nov 26 22:53:25 crc kubenswrapper[5008]: I1126 22:53:25.496562 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-gchxt" podUID="94c69ad0-d741-4198-978f-b688fa003f2e" containerName="registry-server" probeResult="failure" output=< Nov 26 22:53:25 crc kubenswrapper[5008]: timeout: failed to connect service ":50051" within 1s Nov 26 22:53:25 crc kubenswrapper[5008]: > Nov 26 22:53:26 crc kubenswrapper[5008]: I1126 22:53:26.121063 5008 generic.go:334] "Generic (PLEG): container finished" podID="f7687a56-2af0-49c5-a05f-32f3b3b73fb2" containerID="39cb82172745579650105ccffb5e1c3b0a49756a8abc734b13ee99d47adb9dde" exitCode=0 Nov 26 22:53:26 crc kubenswrapper[5008]: I1126 22:53:26.121204 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7" event={"ID":"f7687a56-2af0-49c5-a05f-32f3b3b73fb2","Type":"ContainerDied","Data":"39cb82172745579650105ccffb5e1c3b0a49756a8abc734b13ee99d47adb9dde"} Nov 26 22:53:27 crc kubenswrapper[5008]: I1126 22:53:27.084127 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:53:27 crc kubenswrapper[5008]: I1126 22:53:27.195980 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-0" Nov 26 22:53:27 crc kubenswrapper[5008]: I1126 22:53:27.452360 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7" Nov 26 22:53:27 crc kubenswrapper[5008]: I1126 22:53:27.509033 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7687a56-2af0-49c5-a05f-32f3b3b73fb2-bundle\") pod \"f7687a56-2af0-49c5-a05f-32f3b3b73fb2\" (UID: \"f7687a56-2af0-49c5-a05f-32f3b3b73fb2\") " Nov 26 22:53:27 crc kubenswrapper[5008]: I1126 22:53:27.509227 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75s4v\" (UniqueName: \"kubernetes.io/projected/f7687a56-2af0-49c5-a05f-32f3b3b73fb2-kube-api-access-75s4v\") pod \"f7687a56-2af0-49c5-a05f-32f3b3b73fb2\" (UID: \"f7687a56-2af0-49c5-a05f-32f3b3b73fb2\") " Nov 26 22:53:27 crc kubenswrapper[5008]: I1126 22:53:27.509263 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7687a56-2af0-49c5-a05f-32f3b3b73fb2-util\") pod \"f7687a56-2af0-49c5-a05f-32f3b3b73fb2\" (UID: \"f7687a56-2af0-49c5-a05f-32f3b3b73fb2\") " Nov 26 22:53:27 crc kubenswrapper[5008]: I1126 22:53:27.510267 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7687a56-2af0-49c5-a05f-32f3b3b73fb2-bundle" (OuterVolumeSpecName: "bundle") pod "f7687a56-2af0-49c5-a05f-32f3b3b73fb2" (UID: "f7687a56-2af0-49c5-a05f-32f3b3b73fb2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:53:27 crc kubenswrapper[5008]: I1126 22:53:27.514284 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7687a56-2af0-49c5-a05f-32f3b3b73fb2-kube-api-access-75s4v" (OuterVolumeSpecName: "kube-api-access-75s4v") pod "f7687a56-2af0-49c5-a05f-32f3b3b73fb2" (UID: "f7687a56-2af0-49c5-a05f-32f3b3b73fb2"). InnerVolumeSpecName "kube-api-access-75s4v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:53:27 crc kubenswrapper[5008]: I1126 22:53:27.530764 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7687a56-2af0-49c5-a05f-32f3b3b73fb2-util" (OuterVolumeSpecName: "util") pod "f7687a56-2af0-49c5-a05f-32f3b3b73fb2" (UID: "f7687a56-2af0-49c5-a05f-32f3b3b73fb2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:53:27 crc kubenswrapper[5008]: I1126 22:53:27.611471 5008 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7687a56-2af0-49c5-a05f-32f3b3b73fb2-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 22:53:27 crc kubenswrapper[5008]: I1126 22:53:27.611517 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75s4v\" (UniqueName: \"kubernetes.io/projected/f7687a56-2af0-49c5-a05f-32f3b3b73fb2-kube-api-access-75s4v\") on node \"crc\" DevicePath \"\"" Nov 26 22:53:27 crc kubenswrapper[5008]: I1126 22:53:27.611532 5008 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7687a56-2af0-49c5-a05f-32f3b3b73fb2-util\") on node \"crc\" DevicePath \"\"" Nov 26 22:53:28 crc kubenswrapper[5008]: I1126 22:53:28.139860 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7" event={"ID":"f7687a56-2af0-49c5-a05f-32f3b3b73fb2","Type":"ContainerDied","Data":"cbb25e85630d16b914b3eb80cb3b7e7bd2aea0917d007fdec0a88a9555bf59d7"} Nov 26 22:53:28 crc kubenswrapper[5008]: I1126 22:53:28.139903 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590hmcl7" Nov 26 22:53:28 crc kubenswrapper[5008]: I1126 22:53:28.139933 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbb25e85630d16b914b3eb80cb3b7e7bd2aea0917d007fdec0a88a9555bf59d7" Nov 26 22:53:34 crc kubenswrapper[5008]: I1126 22:53:34.506915 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gchxt" Nov 26 22:53:34 crc kubenswrapper[5008]: I1126 22:53:34.584055 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gchxt" Nov 26 22:53:36 crc kubenswrapper[5008]: I1126 22:53:36.109134 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4"] Nov 26 22:53:36 crc kubenswrapper[5008]: E1126 22:53:36.109457 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7687a56-2af0-49c5-a05f-32f3b3b73fb2" containerName="extract" Nov 26 22:53:36 crc kubenswrapper[5008]: I1126 22:53:36.109478 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7687a56-2af0-49c5-a05f-32f3b3b73fb2" containerName="extract" Nov 26 22:53:36 crc kubenswrapper[5008]: E1126 22:53:36.109499 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7687a56-2af0-49c5-a05f-32f3b3b73fb2" containerName="pull" Nov 26 22:53:36 crc kubenswrapper[5008]: I1126 22:53:36.109511 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7687a56-2af0-49c5-a05f-32f3b3b73fb2" containerName="pull" Nov 26 22:53:36 crc kubenswrapper[5008]: E1126 22:53:36.109529 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7687a56-2af0-49c5-a05f-32f3b3b73fb2" containerName="util" Nov 26 22:53:36 crc kubenswrapper[5008]: I1126 22:53:36.109540 5008 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f7687a56-2af0-49c5-a05f-32f3b3b73fb2" containerName="util" Nov 26 22:53:36 crc kubenswrapper[5008]: I1126 22:53:36.109745 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7687a56-2af0-49c5-a05f-32f3b3b73fb2" containerName="extract" Nov 26 22:53:36 crc kubenswrapper[5008]: I1126 22:53:36.110427 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" Nov 26 22:53:36 crc kubenswrapper[5008]: I1126 22:53:36.115804 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-8xvhw" Nov 26 22:53:36 crc kubenswrapper[5008]: I1126 22:53:36.123323 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4"] Nov 26 22:53:36 crc kubenswrapper[5008]: I1126 22:53:36.254208 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vd5g\" (UniqueName: \"kubernetes.io/projected/d0c58bd4-2da5-4770-b562-aad453776b10-kube-api-access-9vd5g\") pod \"rabbitmq-cluster-operator-779fc9694b-hg8m4\" (UID: \"d0c58bd4-2da5-4770-b562-aad453776b10\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" Nov 26 22:53:36 crc kubenswrapper[5008]: I1126 22:53:36.355649 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vd5g\" (UniqueName: \"kubernetes.io/projected/d0c58bd4-2da5-4770-b562-aad453776b10-kube-api-access-9vd5g\") pod \"rabbitmq-cluster-operator-779fc9694b-hg8m4\" (UID: \"d0c58bd4-2da5-4770-b562-aad453776b10\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" Nov 26 22:53:36 crc kubenswrapper[5008]: I1126 22:53:36.379766 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vd5g\" (UniqueName: 
\"kubernetes.io/projected/d0c58bd4-2da5-4770-b562-aad453776b10-kube-api-access-9vd5g\") pod \"rabbitmq-cluster-operator-779fc9694b-hg8m4\" (UID: \"d0c58bd4-2da5-4770-b562-aad453776b10\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" Nov 26 22:53:36 crc kubenswrapper[5008]: I1126 22:53:36.433467 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" Nov 26 22:53:36 crc kubenswrapper[5008]: I1126 22:53:36.897320 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4"] Nov 26 22:53:37 crc kubenswrapper[5008]: I1126 22:53:37.212596 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" event={"ID":"d0c58bd4-2da5-4770-b562-aad453776b10","Type":"ContainerStarted","Data":"a39fdd5ce1a29a3baf76983202750c9580733f44efdc8a2f7a9af04715263b8d"} Nov 26 22:53:37 crc kubenswrapper[5008]: I1126 22:53:37.781487 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gchxt"] Nov 26 22:53:38 crc kubenswrapper[5008]: I1126 22:53:38.327291 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zsc96"] Nov 26 22:53:38 crc kubenswrapper[5008]: I1126 22:53:38.329387 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zsc96" podUID="af763e0c-bae0-4cd8-abcc-980ee5edd7ab" containerName="registry-server" containerID="cri-o://d4370799d409ca2891abe3bc35cc275e53298ca04124beba8626e0fde1bcbc64" gracePeriod=2 Nov 26 22:53:38 crc kubenswrapper[5008]: I1126 22:53:38.704714 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zsc96" Nov 26 22:53:38 crc kubenswrapper[5008]: I1126 22:53:38.799023 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75qbh\" (UniqueName: \"kubernetes.io/projected/af763e0c-bae0-4cd8-abcc-980ee5edd7ab-kube-api-access-75qbh\") pod \"af763e0c-bae0-4cd8-abcc-980ee5edd7ab\" (UID: \"af763e0c-bae0-4cd8-abcc-980ee5edd7ab\") " Nov 26 22:53:38 crc kubenswrapper[5008]: I1126 22:53:38.799061 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af763e0c-bae0-4cd8-abcc-980ee5edd7ab-catalog-content\") pod \"af763e0c-bae0-4cd8-abcc-980ee5edd7ab\" (UID: \"af763e0c-bae0-4cd8-abcc-980ee5edd7ab\") " Nov 26 22:53:38 crc kubenswrapper[5008]: I1126 22:53:38.799093 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af763e0c-bae0-4cd8-abcc-980ee5edd7ab-utilities\") pod \"af763e0c-bae0-4cd8-abcc-980ee5edd7ab\" (UID: \"af763e0c-bae0-4cd8-abcc-980ee5edd7ab\") " Nov 26 22:53:38 crc kubenswrapper[5008]: I1126 22:53:38.800040 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af763e0c-bae0-4cd8-abcc-980ee5edd7ab-utilities" (OuterVolumeSpecName: "utilities") pod "af763e0c-bae0-4cd8-abcc-980ee5edd7ab" (UID: "af763e0c-bae0-4cd8-abcc-980ee5edd7ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:53:38 crc kubenswrapper[5008]: I1126 22:53:38.817561 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af763e0c-bae0-4cd8-abcc-980ee5edd7ab-kube-api-access-75qbh" (OuterVolumeSpecName: "kube-api-access-75qbh") pod "af763e0c-bae0-4cd8-abcc-980ee5edd7ab" (UID: "af763e0c-bae0-4cd8-abcc-980ee5edd7ab"). InnerVolumeSpecName "kube-api-access-75qbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:53:38 crc kubenswrapper[5008]: I1126 22:53:38.880723 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af763e0c-bae0-4cd8-abcc-980ee5edd7ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af763e0c-bae0-4cd8-abcc-980ee5edd7ab" (UID: "af763e0c-bae0-4cd8-abcc-980ee5edd7ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:53:38 crc kubenswrapper[5008]: I1126 22:53:38.900441 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af763e0c-bae0-4cd8-abcc-980ee5edd7ab-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 22:53:38 crc kubenswrapper[5008]: I1126 22:53:38.900477 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af763e0c-bae0-4cd8-abcc-980ee5edd7ab-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 22:53:38 crc kubenswrapper[5008]: I1126 22:53:38.900490 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75qbh\" (UniqueName: \"kubernetes.io/projected/af763e0c-bae0-4cd8-abcc-980ee5edd7ab-kube-api-access-75qbh\") on node \"crc\" DevicePath \"\"" Nov 26 22:53:39 crc kubenswrapper[5008]: I1126 22:53:39.226394 5008 generic.go:334] "Generic (PLEG): container finished" podID="af763e0c-bae0-4cd8-abcc-980ee5edd7ab" containerID="d4370799d409ca2891abe3bc35cc275e53298ca04124beba8626e0fde1bcbc64" exitCode=0 Nov 26 22:53:39 crc kubenswrapper[5008]: I1126 22:53:39.226437 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zsc96" event={"ID":"af763e0c-bae0-4cd8-abcc-980ee5edd7ab","Type":"ContainerDied","Data":"d4370799d409ca2891abe3bc35cc275e53298ca04124beba8626e0fde1bcbc64"} Nov 26 22:53:39 crc kubenswrapper[5008]: I1126 22:53:39.226501 5008 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-zsc96" event={"ID":"af763e0c-bae0-4cd8-abcc-980ee5edd7ab","Type":"ContainerDied","Data":"933d3ea1bc273567625a99c48172fda773100032d774cc1a3dfb9a95e9f33389"} Nov 26 22:53:39 crc kubenswrapper[5008]: I1126 22:53:39.226501 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zsc96" Nov 26 22:53:39 crc kubenswrapper[5008]: I1126 22:53:39.226519 5008 scope.go:117] "RemoveContainer" containerID="d4370799d409ca2891abe3bc35cc275e53298ca04124beba8626e0fde1bcbc64" Nov 26 22:53:39 crc kubenswrapper[5008]: I1126 22:53:39.261386 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zsc96"] Nov 26 22:53:39 crc kubenswrapper[5008]: I1126 22:53:39.263975 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zsc96"] Nov 26 22:53:39 crc kubenswrapper[5008]: I1126 22:53:39.528660 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af763e0c-bae0-4cd8-abcc-980ee5edd7ab" path="/var/lib/kubelet/pods/af763e0c-bae0-4cd8-abcc-980ee5edd7ab/volumes" Nov 26 22:53:40 crc kubenswrapper[5008]: I1126 22:53:40.711928 5008 scope.go:117] "RemoveContainer" containerID="87d8d251488bc8ae419207315781c81159ec4469ecc29eff55387d9a6854d806" Nov 26 22:53:40 crc kubenswrapper[5008]: I1126 22:53:40.749377 5008 scope.go:117] "RemoveContainer" containerID="5ff97b82db43d48d99c4dfe578691ec81f3b5897b875c8b97e63fa6a50becdcb" Nov 26 22:53:40 crc kubenswrapper[5008]: I1126 22:53:40.774814 5008 scope.go:117] "RemoveContainer" containerID="d4370799d409ca2891abe3bc35cc275e53298ca04124beba8626e0fde1bcbc64" Nov 26 22:53:40 crc kubenswrapper[5008]: E1126 22:53:40.775225 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4370799d409ca2891abe3bc35cc275e53298ca04124beba8626e0fde1bcbc64\": container with ID 
starting with d4370799d409ca2891abe3bc35cc275e53298ca04124beba8626e0fde1bcbc64 not found: ID does not exist" containerID="d4370799d409ca2891abe3bc35cc275e53298ca04124beba8626e0fde1bcbc64" Nov 26 22:53:40 crc kubenswrapper[5008]: I1126 22:53:40.775261 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4370799d409ca2891abe3bc35cc275e53298ca04124beba8626e0fde1bcbc64"} err="failed to get container status \"d4370799d409ca2891abe3bc35cc275e53298ca04124beba8626e0fde1bcbc64\": rpc error: code = NotFound desc = could not find container \"d4370799d409ca2891abe3bc35cc275e53298ca04124beba8626e0fde1bcbc64\": container with ID starting with d4370799d409ca2891abe3bc35cc275e53298ca04124beba8626e0fde1bcbc64 not found: ID does not exist" Nov 26 22:53:40 crc kubenswrapper[5008]: I1126 22:53:40.775286 5008 scope.go:117] "RemoveContainer" containerID="87d8d251488bc8ae419207315781c81159ec4469ecc29eff55387d9a6854d806" Nov 26 22:53:40 crc kubenswrapper[5008]: E1126 22:53:40.775712 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87d8d251488bc8ae419207315781c81159ec4469ecc29eff55387d9a6854d806\": container with ID starting with 87d8d251488bc8ae419207315781c81159ec4469ecc29eff55387d9a6854d806 not found: ID does not exist" containerID="87d8d251488bc8ae419207315781c81159ec4469ecc29eff55387d9a6854d806" Nov 26 22:53:40 crc kubenswrapper[5008]: I1126 22:53:40.775746 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87d8d251488bc8ae419207315781c81159ec4469ecc29eff55387d9a6854d806"} err="failed to get container status \"87d8d251488bc8ae419207315781c81159ec4469ecc29eff55387d9a6854d806\": rpc error: code = NotFound desc = could not find container \"87d8d251488bc8ae419207315781c81159ec4469ecc29eff55387d9a6854d806\": container with ID starting with 87d8d251488bc8ae419207315781c81159ec4469ecc29eff55387d9a6854d806 not found: 
ID does not exist" Nov 26 22:53:40 crc kubenswrapper[5008]: I1126 22:53:40.775765 5008 scope.go:117] "RemoveContainer" containerID="5ff97b82db43d48d99c4dfe578691ec81f3b5897b875c8b97e63fa6a50becdcb" Nov 26 22:53:40 crc kubenswrapper[5008]: E1126 22:53:40.776107 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ff97b82db43d48d99c4dfe578691ec81f3b5897b875c8b97e63fa6a50becdcb\": container with ID starting with 5ff97b82db43d48d99c4dfe578691ec81f3b5897b875c8b97e63fa6a50becdcb not found: ID does not exist" containerID="5ff97b82db43d48d99c4dfe578691ec81f3b5897b875c8b97e63fa6a50becdcb" Nov 26 22:53:40 crc kubenswrapper[5008]: I1126 22:53:40.776130 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff97b82db43d48d99c4dfe578691ec81f3b5897b875c8b97e63fa6a50becdcb"} err="failed to get container status \"5ff97b82db43d48d99c4dfe578691ec81f3b5897b875c8b97e63fa6a50becdcb\": rpc error: code = NotFound desc = could not find container \"5ff97b82db43d48d99c4dfe578691ec81f3b5897b875c8b97e63fa6a50becdcb\": container with ID starting with 5ff97b82db43d48d99c4dfe578691ec81f3b5897b875c8b97e63fa6a50becdcb not found: ID does not exist" Nov 26 22:53:41 crc kubenswrapper[5008]: I1126 22:53:41.240818 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" event={"ID":"d0c58bd4-2da5-4770-b562-aad453776b10","Type":"ContainerStarted","Data":"bc05ad5a81c3457be482457c1712709f741ccb8e7683be16e0410e59c465646c"} Nov 26 22:53:41 crc kubenswrapper[5008]: I1126 22:53:41.259564 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" podStartSLOduration=1.404922449 podStartE2EDuration="5.259548281s" podCreationTimestamp="2025-11-26 22:53:36 +0000 UTC" firstStartedPulling="2025-11-26 22:53:36.908111723 +0000 UTC m=+892.320805725" 
lastFinishedPulling="2025-11-26 22:53:40.762737555 +0000 UTC m=+896.175431557" observedRunningTime="2025-11-26 22:53:41.258054005 +0000 UTC m=+896.670748067" watchObservedRunningTime="2025-11-26 22:53:41.259548281 +0000 UTC m=+896.672242283" Nov 26 22:53:45 crc kubenswrapper[5008]: I1126 22:53:45.882004 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Nov 26 22:53:45 crc kubenswrapper[5008]: E1126 22:53:45.883640 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af763e0c-bae0-4cd8-abcc-980ee5edd7ab" containerName="extract-utilities" Nov 26 22:53:45 crc kubenswrapper[5008]: I1126 22:53:45.883770 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="af763e0c-bae0-4cd8-abcc-980ee5edd7ab" containerName="extract-utilities" Nov 26 22:53:45 crc kubenswrapper[5008]: E1126 22:53:45.883878 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af763e0c-bae0-4cd8-abcc-980ee5edd7ab" containerName="registry-server" Nov 26 22:53:45 crc kubenswrapper[5008]: I1126 22:53:45.883940 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="af763e0c-bae0-4cd8-abcc-980ee5edd7ab" containerName="registry-server" Nov 26 22:53:45 crc kubenswrapper[5008]: E1126 22:53:45.884040 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af763e0c-bae0-4cd8-abcc-980ee5edd7ab" containerName="extract-content" Nov 26 22:53:45 crc kubenswrapper[5008]: I1126 22:53:45.884134 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="af763e0c-bae0-4cd8-abcc-980ee5edd7ab" containerName="extract-content" Nov 26 22:53:45 crc kubenswrapper[5008]: I1126 22:53:45.884348 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="af763e0c-bae0-4cd8-abcc-980ee5edd7ab" containerName="registry-server" Nov 26 22:53:45 crc kubenswrapper[5008]: I1126 22:53:45.885256 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:45 crc kubenswrapper[5008]: I1126 22:53:45.888452 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-default-user" Nov 26 22:53:45 crc kubenswrapper[5008]: I1126 22:53:45.888683 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-server-conf" Nov 26 22:53:45 crc kubenswrapper[5008]: I1126 22:53:45.888860 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-erlang-cookie" Nov 26 22:53:45 crc kubenswrapper[5008]: I1126 22:53:45.888982 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-plugins-conf" Nov 26 22:53:45 crc kubenswrapper[5008]: I1126 22:53:45.892901 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-server-dockercfg-977dp" Nov 26 22:53:45 crc kubenswrapper[5008]: I1126 22:53:45.894907 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.032406 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.032723 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.032835 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.033014 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7c11ae63-a473-4042-9096-b3090e3eb3bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7c11ae63-a473-4042-9096-b3090e3eb3bf\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.033124 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dhhr\" (UniqueName: \"kubernetes.io/projected/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-kube-api-access-6dhhr\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.033229 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.033328 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc 
kubenswrapper[5008]: I1126 22:53:46.033470 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.135137 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.135215 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.135242 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.135270 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.135330 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7c11ae63-a473-4042-9096-b3090e3eb3bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7c11ae63-a473-4042-9096-b3090e3eb3bf\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.135350 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dhhr\" (UniqueName: \"kubernetes.io/projected/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-kube-api-access-6dhhr\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.135375 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.135395 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.135884 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.136548 5008 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.138260 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.139610 5008 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.139744 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7c11ae63-a473-4042-9096-b3090e3eb3bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7c11ae63-a473-4042-9096-b3090e3eb3bf\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8228e73853aabd445882bd921170d904f76a7c67124b0413733b4df91dc5a1f1/globalmount\"" pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.141107 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.141199 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.155027 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.157370 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dhhr\" (UniqueName: \"kubernetes.io/projected/0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a-kube-api-access-6dhhr\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.164140 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7c11ae63-a473-4042-9096-b3090e3eb3bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7c11ae63-a473-4042-9096-b3090e3eb3bf\") pod \"rabbitmq-server-0\" (UID: \"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.209621 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:53:46 crc kubenswrapper[5008]: I1126 22:53:46.459900 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Nov 26 22:53:47 crc kubenswrapper[5008]: I1126 22:53:47.293459 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a","Type":"ContainerStarted","Data":"da59f6d65b838da0f7eb99adfbc17e46e52d1db057b9a06e9e4545b9e42472a0"} Nov 26 22:53:48 crc kubenswrapper[5008]: I1126 22:53:48.527179 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-6jhl8"] Nov 26 22:53:48 crc kubenswrapper[5008]: I1126 22:53:48.528707 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-6jhl8" Nov 26 22:53:48 crc kubenswrapper[5008]: I1126 22:53:48.531986 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-rgs2j" Nov 26 22:53:48 crc kubenswrapper[5008]: I1126 22:53:48.534545 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-6jhl8"] Nov 26 22:53:48 crc kubenswrapper[5008]: I1126 22:53:48.671400 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24fjs\" (UniqueName: \"kubernetes.io/projected/a94c2bcc-6b84-4dd9-9e46-4d040f82c362-kube-api-access-24fjs\") pod \"keystone-operator-index-6jhl8\" (UID: \"a94c2bcc-6b84-4dd9-9e46-4d040f82c362\") " pod="openstack-operators/keystone-operator-index-6jhl8" Nov 26 22:53:48 crc kubenswrapper[5008]: I1126 22:53:48.772779 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24fjs\" (UniqueName: \"kubernetes.io/projected/a94c2bcc-6b84-4dd9-9e46-4d040f82c362-kube-api-access-24fjs\") pod 
\"keystone-operator-index-6jhl8\" (UID: \"a94c2bcc-6b84-4dd9-9e46-4d040f82c362\") " pod="openstack-operators/keystone-operator-index-6jhl8" Nov 26 22:53:48 crc kubenswrapper[5008]: I1126 22:53:48.803904 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24fjs\" (UniqueName: \"kubernetes.io/projected/a94c2bcc-6b84-4dd9-9e46-4d040f82c362-kube-api-access-24fjs\") pod \"keystone-operator-index-6jhl8\" (UID: \"a94c2bcc-6b84-4dd9-9e46-4d040f82c362\") " pod="openstack-operators/keystone-operator-index-6jhl8" Nov 26 22:53:48 crc kubenswrapper[5008]: I1126 22:53:48.849460 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-6jhl8" Nov 26 22:53:51 crc kubenswrapper[5008]: I1126 22:53:51.311334 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-6jhl8"] Nov 26 22:53:52 crc kubenswrapper[5008]: I1126 22:53:52.340137 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-6jhl8" event={"ID":"a94c2bcc-6b84-4dd9-9e46-4d040f82c362","Type":"ContainerStarted","Data":"100f72bffe2169a9f27bbb0791cfbc6b730cbcbfb45461910d0af2643622d602"} Nov 26 22:53:53 crc kubenswrapper[5008]: I1126 22:53:53.354090 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a","Type":"ContainerStarted","Data":"9cd67af9d95b223fe311a2ab24ea9c8473c50d6b8b468fc129e0f7fe5ef9377c"} Nov 26 22:53:54 crc kubenswrapper[5008]: I1126 22:53:54.372533 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-6jhl8" event={"ID":"a94c2bcc-6b84-4dd9-9e46-4d040f82c362","Type":"ContainerStarted","Data":"0301f574c9b0fe62a11cdea374cc521f9bff95823d1d51884012f4aeba64c974"} Nov 26 22:53:54 crc kubenswrapper[5008]: I1126 22:53:54.391192 5008 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/keystone-operator-index-6jhl8" podStartSLOduration=4.230723128 podStartE2EDuration="6.391169558s" podCreationTimestamp="2025-11-26 22:53:48 +0000 UTC" firstStartedPulling="2025-11-26 22:53:51.657660145 +0000 UTC m=+907.070354147" lastFinishedPulling="2025-11-26 22:53:53.818106575 +0000 UTC m=+909.230800577" observedRunningTime="2025-11-26 22:53:54.391001783 +0000 UTC m=+909.803695825" watchObservedRunningTime="2025-11-26 22:53:54.391169558 +0000 UTC m=+909.803863590" Nov 26 22:53:58 crc kubenswrapper[5008]: I1126 22:53:58.849851 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-6jhl8" Nov 26 22:53:58 crc kubenswrapper[5008]: I1126 22:53:58.850164 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-6jhl8" Nov 26 22:53:58 crc kubenswrapper[5008]: I1126 22:53:58.897676 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-6jhl8" Nov 26 22:53:59 crc kubenswrapper[5008]: I1126 22:53:59.281176 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 22:53:59 crc kubenswrapper[5008]: I1126 22:53:59.281267 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 22:53:59 crc kubenswrapper[5008]: I1126 22:53:59.445561 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-index-6jhl8" Nov 26 22:54:01 crc kubenswrapper[5008]: I1126 22:54:01.987536 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6"] Nov 26 22:54:01 crc kubenswrapper[5008]: I1126 22:54:01.990658 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6" Nov 26 22:54:02 crc kubenswrapper[5008]: I1126 22:54:02.003432 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6kdt6" Nov 26 22:54:02 crc kubenswrapper[5008]: I1126 22:54:02.011810 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6"] Nov 26 22:54:02 crc kubenswrapper[5008]: I1126 22:54:02.075384 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5-util\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6\" (UID: \"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6" Nov 26 22:54:02 crc kubenswrapper[5008]: I1126 22:54:02.075456 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kxq8\" (UniqueName: \"kubernetes.io/projected/6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5-kube-api-access-8kxq8\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6\" (UID: \"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6" Nov 26 22:54:02 crc kubenswrapper[5008]: I1126 22:54:02.075506 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5-bundle\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6\" (UID: \"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6" Nov 26 22:54:02 crc kubenswrapper[5008]: I1126 22:54:02.176745 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kxq8\" (UniqueName: \"kubernetes.io/projected/6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5-kube-api-access-8kxq8\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6\" (UID: \"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6" Nov 26 22:54:02 crc kubenswrapper[5008]: I1126 22:54:02.177291 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5-bundle\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6\" (UID: \"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6" Nov 26 22:54:02 crc kubenswrapper[5008]: I1126 22:54:02.177443 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5-util\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6\" (UID: \"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6" Nov 26 22:54:02 crc kubenswrapper[5008]: I1126 22:54:02.178076 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5-bundle\") pod 
\"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6\" (UID: \"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6" Nov 26 22:54:02 crc kubenswrapper[5008]: I1126 22:54:02.178297 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5-util\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6\" (UID: \"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6" Nov 26 22:54:02 crc kubenswrapper[5008]: I1126 22:54:02.201937 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kxq8\" (UniqueName: \"kubernetes.io/projected/6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5-kube-api-access-8kxq8\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6\" (UID: \"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6" Nov 26 22:54:02 crc kubenswrapper[5008]: I1126 22:54:02.320673 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6" Nov 26 22:54:02 crc kubenswrapper[5008]: I1126 22:54:02.595400 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6"] Nov 26 22:54:03 crc kubenswrapper[5008]: I1126 22:54:03.437465 5008 generic.go:334] "Generic (PLEG): container finished" podID="6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5" containerID="9384242627e13b16242643a2133cfb347fd6002a241c4942ae40357fbca6a1ae" exitCode=0 Nov 26 22:54:03 crc kubenswrapper[5008]: I1126 22:54:03.437539 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6" event={"ID":"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5","Type":"ContainerDied","Data":"9384242627e13b16242643a2133cfb347fd6002a241c4942ae40357fbca6a1ae"} Nov 26 22:54:03 crc kubenswrapper[5008]: I1126 22:54:03.437613 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6" event={"ID":"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5","Type":"ContainerStarted","Data":"36b383487559ac38bb66bdb2bf021aa61d5b06f190b1d354607baeac2730603d"} Nov 26 22:54:03 crc kubenswrapper[5008]: I1126 22:54:03.745017 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h52j5"] Nov 26 22:54:03 crc kubenswrapper[5008]: I1126 22:54:03.748086 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h52j5" Nov 26 22:54:03 crc kubenswrapper[5008]: I1126 22:54:03.765235 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h52j5"] Nov 26 22:54:03 crc kubenswrapper[5008]: I1126 22:54:03.801792 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smcf9\" (UniqueName: \"kubernetes.io/projected/c3270387-8a5d-4bf3-b36c-7bf569c88370-kube-api-access-smcf9\") pod \"certified-operators-h52j5\" (UID: \"c3270387-8a5d-4bf3-b36c-7bf569c88370\") " pod="openshift-marketplace/certified-operators-h52j5" Nov 26 22:54:03 crc kubenswrapper[5008]: I1126 22:54:03.801899 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3270387-8a5d-4bf3-b36c-7bf569c88370-utilities\") pod \"certified-operators-h52j5\" (UID: \"c3270387-8a5d-4bf3-b36c-7bf569c88370\") " pod="openshift-marketplace/certified-operators-h52j5" Nov 26 22:54:03 crc kubenswrapper[5008]: I1126 22:54:03.802087 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3270387-8a5d-4bf3-b36c-7bf569c88370-catalog-content\") pod \"certified-operators-h52j5\" (UID: \"c3270387-8a5d-4bf3-b36c-7bf569c88370\") " pod="openshift-marketplace/certified-operators-h52j5" Nov 26 22:54:03 crc kubenswrapper[5008]: I1126 22:54:03.903756 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3270387-8a5d-4bf3-b36c-7bf569c88370-catalog-content\") pod \"certified-operators-h52j5\" (UID: \"c3270387-8a5d-4bf3-b36c-7bf569c88370\") " pod="openshift-marketplace/certified-operators-h52j5" Nov 26 22:54:03 crc kubenswrapper[5008]: I1126 22:54:03.904099 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-smcf9\" (UniqueName: \"kubernetes.io/projected/c3270387-8a5d-4bf3-b36c-7bf569c88370-kube-api-access-smcf9\") pod \"certified-operators-h52j5\" (UID: \"c3270387-8a5d-4bf3-b36c-7bf569c88370\") " pod="openshift-marketplace/certified-operators-h52j5" Nov 26 22:54:03 crc kubenswrapper[5008]: I1126 22:54:03.904129 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3270387-8a5d-4bf3-b36c-7bf569c88370-utilities\") pod \"certified-operators-h52j5\" (UID: \"c3270387-8a5d-4bf3-b36c-7bf569c88370\") " pod="openshift-marketplace/certified-operators-h52j5" Nov 26 22:54:03 crc kubenswrapper[5008]: I1126 22:54:03.904331 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3270387-8a5d-4bf3-b36c-7bf569c88370-catalog-content\") pod \"certified-operators-h52j5\" (UID: \"c3270387-8a5d-4bf3-b36c-7bf569c88370\") " pod="openshift-marketplace/certified-operators-h52j5" Nov 26 22:54:03 crc kubenswrapper[5008]: I1126 22:54:03.904525 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3270387-8a5d-4bf3-b36c-7bf569c88370-utilities\") pod \"certified-operators-h52j5\" (UID: \"c3270387-8a5d-4bf3-b36c-7bf569c88370\") " pod="openshift-marketplace/certified-operators-h52j5" Nov 26 22:54:03 crc kubenswrapper[5008]: I1126 22:54:03.937189 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smcf9\" (UniqueName: \"kubernetes.io/projected/c3270387-8a5d-4bf3-b36c-7bf569c88370-kube-api-access-smcf9\") pod \"certified-operators-h52j5\" (UID: \"c3270387-8a5d-4bf3-b36c-7bf569c88370\") " pod="openshift-marketplace/certified-operators-h52j5" Nov 26 22:54:04 crc kubenswrapper[5008]: I1126 22:54:04.067629 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h52j5" Nov 26 22:54:04 crc kubenswrapper[5008]: I1126 22:54:04.447211 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6" event={"ID":"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5","Type":"ContainerStarted","Data":"9304a985e2ee3d38a269ba856019b83ed09bd78c48765c4a9f8ff9993a20adb0"} Nov 26 22:54:04 crc kubenswrapper[5008]: I1126 22:54:04.505389 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h52j5"] Nov 26 22:54:04 crc kubenswrapper[5008]: W1126 22:54:04.512143 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3270387_8a5d_4bf3_b36c_7bf569c88370.slice/crio-147cec6fd41720877587ca2d4ac0d3e4d1619cbeb85972ba9fd5b032b838156b WatchSource:0}: Error finding container 147cec6fd41720877587ca2d4ac0d3e4d1619cbeb85972ba9fd5b032b838156b: Status 404 returned error can't find the container with id 147cec6fd41720877587ca2d4ac0d3e4d1619cbeb85972ba9fd5b032b838156b Nov 26 22:54:05 crc kubenswrapper[5008]: I1126 22:54:05.458599 5008 generic.go:334] "Generic (PLEG): container finished" podID="c3270387-8a5d-4bf3-b36c-7bf569c88370" containerID="727e62b8217d4a34477bc0b004d5873ed90c35f66899b021594547460ef2acda" exitCode=0 Nov 26 22:54:05 crc kubenswrapper[5008]: I1126 22:54:05.458751 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h52j5" event={"ID":"c3270387-8a5d-4bf3-b36c-7bf569c88370","Type":"ContainerDied","Data":"727e62b8217d4a34477bc0b004d5873ed90c35f66899b021594547460ef2acda"} Nov 26 22:54:05 crc kubenswrapper[5008]: I1126 22:54:05.460890 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h52j5" 
event={"ID":"c3270387-8a5d-4bf3-b36c-7bf569c88370","Type":"ContainerStarted","Data":"147cec6fd41720877587ca2d4ac0d3e4d1619cbeb85972ba9fd5b032b838156b"} Nov 26 22:54:05 crc kubenswrapper[5008]: I1126 22:54:05.467780 5008 generic.go:334] "Generic (PLEG): container finished" podID="6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5" containerID="9304a985e2ee3d38a269ba856019b83ed09bd78c48765c4a9f8ff9993a20adb0" exitCode=0 Nov 26 22:54:05 crc kubenswrapper[5008]: I1126 22:54:05.468206 5008 generic.go:334] "Generic (PLEG): container finished" podID="6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5" containerID="e036615941814113e9641919c8a85bacfc6543efa4e264ddb2279bd7c708b27b" exitCode=0 Nov 26 22:54:05 crc kubenswrapper[5008]: I1126 22:54:05.468088 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6" event={"ID":"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5","Type":"ContainerDied","Data":"9304a985e2ee3d38a269ba856019b83ed09bd78c48765c4a9f8ff9993a20adb0"} Nov 26 22:54:05 crc kubenswrapper[5008]: I1126 22:54:05.468547 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6" event={"ID":"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5","Type":"ContainerDied","Data":"e036615941814113e9641919c8a85bacfc6543efa4e264ddb2279bd7c708b27b"} Nov 26 22:54:06 crc kubenswrapper[5008]: I1126 22:54:06.484232 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h52j5" event={"ID":"c3270387-8a5d-4bf3-b36c-7bf569c88370","Type":"ContainerStarted","Data":"cd63e1bfb036e2b07738b485c7cc5c244377f39c6ed1df24bc3ae2974d42329b"} Nov 26 22:54:06 crc kubenswrapper[5008]: I1126 22:54:06.923093 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6" Nov 26 22:54:06 crc kubenswrapper[5008]: I1126 22:54:06.953629 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5-bundle\") pod \"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5\" (UID: \"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5\") " Nov 26 22:54:06 crc kubenswrapper[5008]: I1126 22:54:06.954056 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kxq8\" (UniqueName: \"kubernetes.io/projected/6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5-kube-api-access-8kxq8\") pod \"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5\" (UID: \"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5\") " Nov 26 22:54:06 crc kubenswrapper[5008]: I1126 22:54:06.954093 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5-util\") pod \"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5\" (UID: \"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5\") " Nov 26 22:54:06 crc kubenswrapper[5008]: I1126 22:54:06.955128 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5-bundle" (OuterVolumeSpecName: "bundle") pod "6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5" (UID: "6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:54:06 crc kubenswrapper[5008]: I1126 22:54:06.961573 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5-kube-api-access-8kxq8" (OuterVolumeSpecName: "kube-api-access-8kxq8") pod "6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5" (UID: "6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5"). InnerVolumeSpecName "kube-api-access-8kxq8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:54:06 crc kubenswrapper[5008]: I1126 22:54:06.976235 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5-util" (OuterVolumeSpecName: "util") pod "6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5" (UID: "6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:54:07 crc kubenswrapper[5008]: I1126 22:54:07.055899 5008 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 22:54:07 crc kubenswrapper[5008]: I1126 22:54:07.056240 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kxq8\" (UniqueName: \"kubernetes.io/projected/6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5-kube-api-access-8kxq8\") on node \"crc\" DevicePath \"\"" Nov 26 22:54:07 crc kubenswrapper[5008]: I1126 22:54:07.056327 5008 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5-util\") on node \"crc\" DevicePath \"\"" Nov 26 22:54:07 crc kubenswrapper[5008]: I1126 22:54:07.497387 5008 generic.go:334] "Generic (PLEG): container finished" podID="c3270387-8a5d-4bf3-b36c-7bf569c88370" containerID="cd63e1bfb036e2b07738b485c7cc5c244377f39c6ed1df24bc3ae2974d42329b" exitCode=0 Nov 26 22:54:07 crc kubenswrapper[5008]: I1126 22:54:07.497503 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h52j5" event={"ID":"c3270387-8a5d-4bf3-b36c-7bf569c88370","Type":"ContainerDied","Data":"cd63e1bfb036e2b07738b485c7cc5c244377f39c6ed1df24bc3ae2974d42329b"} Nov 26 22:54:07 crc kubenswrapper[5008]: I1126 22:54:07.501215 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6" event={"ID":"6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5","Type":"ContainerDied","Data":"36b383487559ac38bb66bdb2bf021aa61d5b06f190b1d354607baeac2730603d"} Nov 26 22:54:07 crc kubenswrapper[5008]: I1126 22:54:07.501272 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36b383487559ac38bb66bdb2bf021aa61d5b06f190b1d354607baeac2730603d" Nov 26 22:54:07 crc kubenswrapper[5008]: I1126 22:54:07.501288 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3msct6" Nov 26 22:54:09 crc kubenswrapper[5008]: I1126 22:54:09.528206 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h52j5" event={"ID":"c3270387-8a5d-4bf3-b36c-7bf569c88370","Type":"ContainerStarted","Data":"2e4aeb1fbaa5e01287d384838974f8ca745b58f89b1fb05919a6e9977ba17340"} Nov 26 22:54:09 crc kubenswrapper[5008]: I1126 22:54:09.543143 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h52j5" podStartSLOduration=3.666204744 podStartE2EDuration="6.543117403s" podCreationTimestamp="2025-11-26 22:54:03 +0000 UTC" firstStartedPulling="2025-11-26 22:54:05.462250972 +0000 UTC m=+920.874945004" lastFinishedPulling="2025-11-26 22:54:08.339163631 +0000 UTC m=+923.751857663" observedRunningTime="2025-11-26 22:54:09.541181901 +0000 UTC m=+924.953875953" watchObservedRunningTime="2025-11-26 22:54:09.543117403 +0000 UTC m=+924.955811435" Nov 26 22:54:14 crc kubenswrapper[5008]: I1126 22:54:14.068895 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h52j5" Nov 26 22:54:14 crc kubenswrapper[5008]: I1126 22:54:14.069390 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-h52j5" Nov 26 22:54:14 crc kubenswrapper[5008]: I1126 22:54:14.107235 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h52j5" Nov 26 22:54:14 crc kubenswrapper[5008]: I1126 22:54:14.623158 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h52j5" Nov 26 22:54:17 crc kubenswrapper[5008]: I1126 22:54:17.239859 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9"] Nov 26 22:54:17 crc kubenswrapper[5008]: E1126 22:54:17.240104 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5" containerName="extract" Nov 26 22:54:17 crc kubenswrapper[5008]: I1126 22:54:17.240116 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5" containerName="extract" Nov 26 22:54:17 crc kubenswrapper[5008]: E1126 22:54:17.240130 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5" containerName="util" Nov 26 22:54:17 crc kubenswrapper[5008]: I1126 22:54:17.240136 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5" containerName="util" Nov 26 22:54:17 crc kubenswrapper[5008]: E1126 22:54:17.240152 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5" containerName="pull" Nov 26 22:54:17 crc kubenswrapper[5008]: I1126 22:54:17.240158 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5" containerName="pull" Nov 26 22:54:17 crc kubenswrapper[5008]: I1126 22:54:17.240258 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e1517c5-9f53-42ac-a5a0-61cd0bd2fdc5" containerName="extract" Nov 26 22:54:17 crc kubenswrapper[5008]: 
I1126 22:54:17.240699 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" Nov 26 22:54:17 crc kubenswrapper[5008]: I1126 22:54:17.243351 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Nov 26 22:54:17 crc kubenswrapper[5008]: I1126 22:54:17.244028 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-dkwr2" Nov 26 22:54:17 crc kubenswrapper[5008]: I1126 22:54:17.252670 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9"] Nov 26 22:54:17 crc kubenswrapper[5008]: I1126 22:54:17.309784 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/72dcab39-084a-40c0-8646-acf173ea065d-webhook-cert\") pod \"keystone-operator-controller-manager-7f6c555587-tvtn9\" (UID: \"72dcab39-084a-40c0-8646-acf173ea065d\") " pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" Nov 26 22:54:17 crc kubenswrapper[5008]: I1126 22:54:17.309848 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrnwh\" (UniqueName: \"kubernetes.io/projected/72dcab39-084a-40c0-8646-acf173ea065d-kube-api-access-hrnwh\") pod \"keystone-operator-controller-manager-7f6c555587-tvtn9\" (UID: \"72dcab39-084a-40c0-8646-acf173ea065d\") " pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" Nov 26 22:54:17 crc kubenswrapper[5008]: I1126 22:54:17.309882 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/72dcab39-084a-40c0-8646-acf173ea065d-apiservice-cert\") pod \"keystone-operator-controller-manager-7f6c555587-tvtn9\" (UID: \"72dcab39-084a-40c0-8646-acf173ea065d\") " pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" Nov 26 22:54:17 crc kubenswrapper[5008]: I1126 22:54:17.411140 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrnwh\" (UniqueName: \"kubernetes.io/projected/72dcab39-084a-40c0-8646-acf173ea065d-kube-api-access-hrnwh\") pod \"keystone-operator-controller-manager-7f6c555587-tvtn9\" (UID: \"72dcab39-084a-40c0-8646-acf173ea065d\") " pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" Nov 26 22:54:17 crc kubenswrapper[5008]: I1126 22:54:17.411207 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/72dcab39-084a-40c0-8646-acf173ea065d-apiservice-cert\") pod \"keystone-operator-controller-manager-7f6c555587-tvtn9\" (UID: \"72dcab39-084a-40c0-8646-acf173ea065d\") " pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" Nov 26 22:54:17 crc kubenswrapper[5008]: I1126 22:54:17.411273 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/72dcab39-084a-40c0-8646-acf173ea065d-webhook-cert\") pod \"keystone-operator-controller-manager-7f6c555587-tvtn9\" (UID: \"72dcab39-084a-40c0-8646-acf173ea065d\") " pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" Nov 26 22:54:17 crc kubenswrapper[5008]: I1126 22:54:17.424181 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/72dcab39-084a-40c0-8646-acf173ea065d-webhook-cert\") pod \"keystone-operator-controller-manager-7f6c555587-tvtn9\" (UID: \"72dcab39-084a-40c0-8646-acf173ea065d\") " 
pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" Nov 26 22:54:17 crc kubenswrapper[5008]: I1126 22:54:17.434997 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrnwh\" (UniqueName: \"kubernetes.io/projected/72dcab39-084a-40c0-8646-acf173ea065d-kube-api-access-hrnwh\") pod \"keystone-operator-controller-manager-7f6c555587-tvtn9\" (UID: \"72dcab39-084a-40c0-8646-acf173ea065d\") " pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" Nov 26 22:54:17 crc kubenswrapper[5008]: I1126 22:54:17.435395 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/72dcab39-084a-40c0-8646-acf173ea065d-apiservice-cert\") pod \"keystone-operator-controller-manager-7f6c555587-tvtn9\" (UID: \"72dcab39-084a-40c0-8646-acf173ea065d\") " pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" Nov 26 22:54:17 crc kubenswrapper[5008]: I1126 22:54:17.565662 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" Nov 26 22:54:17 crc kubenswrapper[5008]: I1126 22:54:17.722540 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h52j5"] Nov 26 22:54:17 crc kubenswrapper[5008]: I1126 22:54:17.723175 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h52j5" podUID="c3270387-8a5d-4bf3-b36c-7bf569c88370" containerName="registry-server" containerID="cri-o://2e4aeb1fbaa5e01287d384838974f8ca745b58f89b1fb05919a6e9977ba17340" gracePeriod=2 Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.054118 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9"] Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.239158 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h52j5" Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.328937 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smcf9\" (UniqueName: \"kubernetes.io/projected/c3270387-8a5d-4bf3-b36c-7bf569c88370-kube-api-access-smcf9\") pod \"c3270387-8a5d-4bf3-b36c-7bf569c88370\" (UID: \"c3270387-8a5d-4bf3-b36c-7bf569c88370\") " Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.329033 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3270387-8a5d-4bf3-b36c-7bf569c88370-catalog-content\") pod \"c3270387-8a5d-4bf3-b36c-7bf569c88370\" (UID: \"c3270387-8a5d-4bf3-b36c-7bf569c88370\") " Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.329091 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c3270387-8a5d-4bf3-b36c-7bf569c88370-utilities\") pod \"c3270387-8a5d-4bf3-b36c-7bf569c88370\" (UID: \"c3270387-8a5d-4bf3-b36c-7bf569c88370\") " Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.330167 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3270387-8a5d-4bf3-b36c-7bf569c88370-utilities" (OuterVolumeSpecName: "utilities") pod "c3270387-8a5d-4bf3-b36c-7bf569c88370" (UID: "c3270387-8a5d-4bf3-b36c-7bf569c88370"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.334698 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3270387-8a5d-4bf3-b36c-7bf569c88370-kube-api-access-smcf9" (OuterVolumeSpecName: "kube-api-access-smcf9") pod "c3270387-8a5d-4bf3-b36c-7bf569c88370" (UID: "c3270387-8a5d-4bf3-b36c-7bf569c88370"). InnerVolumeSpecName "kube-api-access-smcf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.379193 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3270387-8a5d-4bf3-b36c-7bf569c88370-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3270387-8a5d-4bf3-b36c-7bf569c88370" (UID: "c3270387-8a5d-4bf3-b36c-7bf569c88370"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.430581 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3270387-8a5d-4bf3-b36c-7bf569c88370-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.430621 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smcf9\" (UniqueName: \"kubernetes.io/projected/c3270387-8a5d-4bf3-b36c-7bf569c88370-kube-api-access-smcf9\") on node \"crc\" DevicePath \"\"" Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.430638 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3270387-8a5d-4bf3-b36c-7bf569c88370-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.593203 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" event={"ID":"72dcab39-084a-40c0-8646-acf173ea065d","Type":"ContainerStarted","Data":"f625fe3dc2671d68b40b3b22071ab03faecb2f9a6028c5f0b5912b01f0ead0e7"} Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.595379 5008 generic.go:334] "Generic (PLEG): container finished" podID="c3270387-8a5d-4bf3-b36c-7bf569c88370" containerID="2e4aeb1fbaa5e01287d384838974f8ca745b58f89b1fb05919a6e9977ba17340" exitCode=0 Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.595434 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h52j5" event={"ID":"c3270387-8a5d-4bf3-b36c-7bf569c88370","Type":"ContainerDied","Data":"2e4aeb1fbaa5e01287d384838974f8ca745b58f89b1fb05919a6e9977ba17340"} Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.595439 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h52j5" Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.595675 5008 scope.go:117] "RemoveContainer" containerID="2e4aeb1fbaa5e01287d384838974f8ca745b58f89b1fb05919a6e9977ba17340" Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.595655 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h52j5" event={"ID":"c3270387-8a5d-4bf3-b36c-7bf569c88370","Type":"ContainerDied","Data":"147cec6fd41720877587ca2d4ac0d3e4d1619cbeb85972ba9fd5b032b838156b"} Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.614931 5008 scope.go:117] "RemoveContainer" containerID="cd63e1bfb036e2b07738b485c7cc5c244377f39c6ed1df24bc3ae2974d42329b" Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.628497 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h52j5"] Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.634234 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h52j5"] Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.645014 5008 scope.go:117] "RemoveContainer" containerID="727e62b8217d4a34477bc0b004d5873ed90c35f66899b021594547460ef2acda" Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.663146 5008 scope.go:117] "RemoveContainer" containerID="2e4aeb1fbaa5e01287d384838974f8ca745b58f89b1fb05919a6e9977ba17340" Nov 26 22:54:18 crc kubenswrapper[5008]: E1126 22:54:18.663775 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e4aeb1fbaa5e01287d384838974f8ca745b58f89b1fb05919a6e9977ba17340\": container with ID starting with 2e4aeb1fbaa5e01287d384838974f8ca745b58f89b1fb05919a6e9977ba17340 not found: ID does not exist" containerID="2e4aeb1fbaa5e01287d384838974f8ca745b58f89b1fb05919a6e9977ba17340" Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.663817 5008 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e4aeb1fbaa5e01287d384838974f8ca745b58f89b1fb05919a6e9977ba17340"} err="failed to get container status \"2e4aeb1fbaa5e01287d384838974f8ca745b58f89b1fb05919a6e9977ba17340\": rpc error: code = NotFound desc = could not find container \"2e4aeb1fbaa5e01287d384838974f8ca745b58f89b1fb05919a6e9977ba17340\": container with ID starting with 2e4aeb1fbaa5e01287d384838974f8ca745b58f89b1fb05919a6e9977ba17340 not found: ID does not exist" Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.663840 5008 scope.go:117] "RemoveContainer" containerID="cd63e1bfb036e2b07738b485c7cc5c244377f39c6ed1df24bc3ae2974d42329b" Nov 26 22:54:18 crc kubenswrapper[5008]: E1126 22:54:18.664185 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd63e1bfb036e2b07738b485c7cc5c244377f39c6ed1df24bc3ae2974d42329b\": container with ID starting with cd63e1bfb036e2b07738b485c7cc5c244377f39c6ed1df24bc3ae2974d42329b not found: ID does not exist" containerID="cd63e1bfb036e2b07738b485c7cc5c244377f39c6ed1df24bc3ae2974d42329b" Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.664230 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd63e1bfb036e2b07738b485c7cc5c244377f39c6ed1df24bc3ae2974d42329b"} err="failed to get container status \"cd63e1bfb036e2b07738b485c7cc5c244377f39c6ed1df24bc3ae2974d42329b\": rpc error: code = NotFound desc = could not find container \"cd63e1bfb036e2b07738b485c7cc5c244377f39c6ed1df24bc3ae2974d42329b\": container with ID starting with cd63e1bfb036e2b07738b485c7cc5c244377f39c6ed1df24bc3ae2974d42329b not found: ID does not exist" Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.664251 5008 scope.go:117] "RemoveContainer" containerID="727e62b8217d4a34477bc0b004d5873ed90c35f66899b021594547460ef2acda" Nov 26 22:54:18 crc kubenswrapper[5008]: E1126 
22:54:18.664572 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"727e62b8217d4a34477bc0b004d5873ed90c35f66899b021594547460ef2acda\": container with ID starting with 727e62b8217d4a34477bc0b004d5873ed90c35f66899b021594547460ef2acda not found: ID does not exist" containerID="727e62b8217d4a34477bc0b004d5873ed90c35f66899b021594547460ef2acda" Nov 26 22:54:18 crc kubenswrapper[5008]: I1126 22:54:18.664603 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"727e62b8217d4a34477bc0b004d5873ed90c35f66899b021594547460ef2acda"} err="failed to get container status \"727e62b8217d4a34477bc0b004d5873ed90c35f66899b021594547460ef2acda\": rpc error: code = NotFound desc = could not find container \"727e62b8217d4a34477bc0b004d5873ed90c35f66899b021594547460ef2acda\": container with ID starting with 727e62b8217d4a34477bc0b004d5873ed90c35f66899b021594547460ef2acda not found: ID does not exist" Nov 26 22:54:19 crc kubenswrapper[5008]: I1126 22:54:19.545542 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3270387-8a5d-4bf3-b36c-7bf569c88370" path="/var/lib/kubelet/pods/c3270387-8a5d-4bf3-b36c-7bf569c88370/volumes" Nov 26 22:54:22 crc kubenswrapper[5008]: I1126 22:54:22.628035 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" event={"ID":"72dcab39-084a-40c0-8646-acf173ea065d","Type":"ContainerStarted","Data":"703975cab045914a73e041489c2576731e07258d37640438c70791f25ae67c2f"} Nov 26 22:54:22 crc kubenswrapper[5008]: I1126 22:54:22.628616 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" Nov 26 22:54:22 crc kubenswrapper[5008]: I1126 22:54:22.645212 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" podStartSLOduration=2.042356069 podStartE2EDuration="5.645196351s" podCreationTimestamp="2025-11-26 22:54:17 +0000 UTC" firstStartedPulling="2025-11-26 22:54:18.070651473 +0000 UTC m=+933.483345475" lastFinishedPulling="2025-11-26 22:54:21.673491755 +0000 UTC m=+937.086185757" observedRunningTime="2025-11-26 22:54:22.642160856 +0000 UTC m=+938.054854878" watchObservedRunningTime="2025-11-26 22:54:22.645196351 +0000 UTC m=+938.057890353" Nov 26 22:54:25 crc kubenswrapper[5008]: I1126 22:54:25.651158 5008 generic.go:334] "Generic (PLEG): container finished" podID="0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a" containerID="9cd67af9d95b223fe311a2ab24ea9c8473c50d6b8b468fc129e0f7fe5ef9377c" exitCode=0 Nov 26 22:54:25 crc kubenswrapper[5008]: I1126 22:54:25.651334 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a","Type":"ContainerDied","Data":"9cd67af9d95b223fe311a2ab24ea9c8473c50d6b8b468fc129e0f7fe5ef9377c"} Nov 26 22:54:26 crc kubenswrapper[5008]: I1126 22:54:26.660454 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"0dd74a4d-97a5-43c8-be0a-efb1fa3ab49a","Type":"ContainerStarted","Data":"ceee38325720b97c4a47f0a8b2ee3d9476c1d62ea4680efe3b4b231bac270984"} Nov 26 22:54:26 crc kubenswrapper[5008]: I1126 22:54:26.661220 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:54:26 crc kubenswrapper[5008]: I1126 22:54:26.684437 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/rabbitmq-server-0" podStartSLOduration=37.441888769 podStartE2EDuration="42.684413921s" podCreationTimestamp="2025-11-26 22:53:44 +0000 UTC" firstStartedPulling="2025-11-26 22:53:46.476356146 +0000 UTC m=+901.889050148" lastFinishedPulling="2025-11-26 
22:53:51.718881288 +0000 UTC m=+907.131575300" observedRunningTime="2025-11-26 22:54:26.681275863 +0000 UTC m=+942.093969885" watchObservedRunningTime="2025-11-26 22:54:26.684413921 +0000 UTC m=+942.097107963" Nov 26 22:54:27 crc kubenswrapper[5008]: I1126 22:54:27.569626 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" Nov 26 22:54:29 crc kubenswrapper[5008]: I1126 22:54:29.281452 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 22:54:29 crc kubenswrapper[5008]: I1126 22:54:29.281857 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.272449 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-12a7-account-create-update-hd8tx"] Nov 26 22:54:32 crc kubenswrapper[5008]: E1126 22:54:32.272916 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3270387-8a5d-4bf3-b36c-7bf569c88370" containerName="extract-content" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.272929 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3270387-8a5d-4bf3-b36c-7bf569c88370" containerName="extract-content" Nov 26 22:54:32 crc kubenswrapper[5008]: E1126 22:54:32.272945 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3270387-8a5d-4bf3-b36c-7bf569c88370" containerName="extract-utilities" Nov 26 22:54:32 crc 
kubenswrapper[5008]: I1126 22:54:32.272952 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3270387-8a5d-4bf3-b36c-7bf569c88370" containerName="extract-utilities" Nov 26 22:54:32 crc kubenswrapper[5008]: E1126 22:54:32.272983 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3270387-8a5d-4bf3-b36c-7bf569c88370" containerName="registry-server" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.272992 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3270387-8a5d-4bf3-b36c-7bf569c88370" containerName="registry-server" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.273130 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3270387-8a5d-4bf3-b36c-7bf569c88370" containerName="registry-server" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.273584 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-12a7-account-create-update-hd8tx" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.277466 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-db-secret" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.288418 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-12a7-account-create-update-hd8tx"] Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.332070 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e27f185-277d-4332-bb8c-1968a0b82b01-operator-scripts\") pod \"keystone-12a7-account-create-update-hd8tx\" (UID: \"9e27f185-277d-4332-bb8c-1968a0b82b01\") " pod="glance-kuttl-tests/keystone-12a7-account-create-update-hd8tx" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.332253 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kkx5\" (UniqueName: 
\"kubernetes.io/projected/9e27f185-277d-4332-bb8c-1968a0b82b01-kube-api-access-2kkx5\") pod \"keystone-12a7-account-create-update-hd8tx\" (UID: \"9e27f185-277d-4332-bb8c-1968a0b82b01\") " pod="glance-kuttl-tests/keystone-12a7-account-create-update-hd8tx" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.367388 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-create-k9zcg"] Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.369119 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-k9zcg" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.374074 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-k9zcg"] Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.433801 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kkx5\" (UniqueName: \"kubernetes.io/projected/9e27f185-277d-4332-bb8c-1968a0b82b01-kube-api-access-2kkx5\") pod \"keystone-12a7-account-create-update-hd8tx\" (UID: \"9e27f185-277d-4332-bb8c-1968a0b82b01\") " pod="glance-kuttl-tests/keystone-12a7-account-create-update-hd8tx" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.433882 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c1a3cd7-269b-4810-8e91-fba246cea26f-operator-scripts\") pod \"keystone-db-create-k9zcg\" (UID: \"8c1a3cd7-269b-4810-8e91-fba246cea26f\") " pod="glance-kuttl-tests/keystone-db-create-k9zcg" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.433937 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn54t\" (UniqueName: \"kubernetes.io/projected/8c1a3cd7-269b-4810-8e91-fba246cea26f-kube-api-access-vn54t\") pod \"keystone-db-create-k9zcg\" (UID: 
\"8c1a3cd7-269b-4810-8e91-fba246cea26f\") " pod="glance-kuttl-tests/keystone-db-create-k9zcg" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.434208 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e27f185-277d-4332-bb8c-1968a0b82b01-operator-scripts\") pod \"keystone-12a7-account-create-update-hd8tx\" (UID: \"9e27f185-277d-4332-bb8c-1968a0b82b01\") " pod="glance-kuttl-tests/keystone-12a7-account-create-update-hd8tx" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.435327 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e27f185-277d-4332-bb8c-1968a0b82b01-operator-scripts\") pod \"keystone-12a7-account-create-update-hd8tx\" (UID: \"9e27f185-277d-4332-bb8c-1968a0b82b01\") " pod="glance-kuttl-tests/keystone-12a7-account-create-update-hd8tx" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.463650 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kkx5\" (UniqueName: \"kubernetes.io/projected/9e27f185-277d-4332-bb8c-1968a0b82b01-kube-api-access-2kkx5\") pod \"keystone-12a7-account-create-update-hd8tx\" (UID: \"9e27f185-277d-4332-bb8c-1968a0b82b01\") " pod="glance-kuttl-tests/keystone-12a7-account-create-update-hd8tx" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.528709 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-index-rksbv"] Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.529948 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-rksbv" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.533985 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-index-dockercfg-8rxbq" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.536099 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c1a3cd7-269b-4810-8e91-fba246cea26f-operator-scripts\") pod \"keystone-db-create-k9zcg\" (UID: \"8c1a3cd7-269b-4810-8e91-fba246cea26f\") " pod="glance-kuttl-tests/keystone-db-create-k9zcg" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.536237 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn54t\" (UniqueName: \"kubernetes.io/projected/8c1a3cd7-269b-4810-8e91-fba246cea26f-kube-api-access-vn54t\") pod \"keystone-db-create-k9zcg\" (UID: \"8c1a3cd7-269b-4810-8e91-fba246cea26f\") " pod="glance-kuttl-tests/keystone-db-create-k9zcg" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.536748 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c1a3cd7-269b-4810-8e91-fba246cea26f-operator-scripts\") pod \"keystone-db-create-k9zcg\" (UID: \"8c1a3cd7-269b-4810-8e91-fba246cea26f\") " pod="glance-kuttl-tests/keystone-db-create-k9zcg" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.537742 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-rksbv"] Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.564759 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn54t\" (UniqueName: \"kubernetes.io/projected/8c1a3cd7-269b-4810-8e91-fba246cea26f-kube-api-access-vn54t\") pod \"keystone-db-create-k9zcg\" (UID: \"8c1a3cd7-269b-4810-8e91-fba246cea26f\") " 
pod="glance-kuttl-tests/keystone-db-create-k9zcg" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.595507 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-12a7-account-create-update-hd8tx" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.637635 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx7jh\" (UniqueName: \"kubernetes.io/projected/a27cbbc0-6a27-4b03-8788-c8f3fdd090b1-kube-api-access-lx7jh\") pod \"horizon-operator-index-rksbv\" (UID: \"a27cbbc0-6a27-4b03-8788-c8f3fdd090b1\") " pod="openstack-operators/horizon-operator-index-rksbv" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.695367 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-k9zcg" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.739681 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx7jh\" (UniqueName: \"kubernetes.io/projected/a27cbbc0-6a27-4b03-8788-c8f3fdd090b1-kube-api-access-lx7jh\") pod \"horizon-operator-index-rksbv\" (UID: \"a27cbbc0-6a27-4b03-8788-c8f3fdd090b1\") " pod="openstack-operators/horizon-operator-index-rksbv" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.760511 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx7jh\" (UniqueName: \"kubernetes.io/projected/a27cbbc0-6a27-4b03-8788-c8f3fdd090b1-kube-api-access-lx7jh\") pod \"horizon-operator-index-rksbv\" (UID: \"a27cbbc0-6a27-4b03-8788-c8f3fdd090b1\") " pod="openstack-operators/horizon-operator-index-rksbv" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.849343 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-rksbv" Nov 26 22:54:32 crc kubenswrapper[5008]: I1126 22:54:32.941420 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-k9zcg"] Nov 26 22:54:32 crc kubenswrapper[5008]: W1126 22:54:32.950695 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c1a3cd7_269b_4810_8e91_fba246cea26f.slice/crio-e621622c11d968cf03cc27a82eec68a16568fb5dddb541657c3aa589940e64a2 WatchSource:0}: Error finding container e621622c11d968cf03cc27a82eec68a16568fb5dddb541657c3aa589940e64a2: Status 404 returned error can't find the container with id e621622c11d968cf03cc27a82eec68a16568fb5dddb541657c3aa589940e64a2 Nov 26 22:54:33 crc kubenswrapper[5008]: I1126 22:54:33.069734 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-12a7-account-create-update-hd8tx"] Nov 26 22:54:33 crc kubenswrapper[5008]: I1126 22:54:33.096808 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-rksbv"] Nov 26 22:54:33 crc kubenswrapper[5008]: I1126 22:54:33.724268 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-rksbv" event={"ID":"a27cbbc0-6a27-4b03-8788-c8f3fdd090b1","Type":"ContainerStarted","Data":"a377190cbcdb342a18779b88b1133a155a9061fe4c873769e19d5d55bd6f5a3c"} Nov 26 22:54:33 crc kubenswrapper[5008]: I1126 22:54:33.727370 5008 generic.go:334] "Generic (PLEG): container finished" podID="8c1a3cd7-269b-4810-8e91-fba246cea26f" containerID="b422c1346199777f9fe0a391ad708bc942662514eb2b5982f5724e360208e8e1" exitCode=0 Nov 26 22:54:33 crc kubenswrapper[5008]: I1126 22:54:33.727466 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-k9zcg" 
event={"ID":"8c1a3cd7-269b-4810-8e91-fba246cea26f","Type":"ContainerDied","Data":"b422c1346199777f9fe0a391ad708bc942662514eb2b5982f5724e360208e8e1"} Nov 26 22:54:33 crc kubenswrapper[5008]: I1126 22:54:33.727502 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-k9zcg" event={"ID":"8c1a3cd7-269b-4810-8e91-fba246cea26f","Type":"ContainerStarted","Data":"e621622c11d968cf03cc27a82eec68a16568fb5dddb541657c3aa589940e64a2"} Nov 26 22:54:33 crc kubenswrapper[5008]: I1126 22:54:33.730453 5008 generic.go:334] "Generic (PLEG): container finished" podID="9e27f185-277d-4332-bb8c-1968a0b82b01" containerID="cb849b91ffbbef6c85abc8c6b054d6eaf32456c2c533cadf9df6e367cbd6f2ef" exitCode=0 Nov 26 22:54:33 crc kubenswrapper[5008]: I1126 22:54:33.730529 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-12a7-account-create-update-hd8tx" event={"ID":"9e27f185-277d-4332-bb8c-1968a0b82b01","Type":"ContainerDied","Data":"cb849b91ffbbef6c85abc8c6b054d6eaf32456c2c533cadf9df6e367cbd6f2ef"} Nov 26 22:54:33 crc kubenswrapper[5008]: I1126 22:54:33.730582 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-12a7-account-create-update-hd8tx" event={"ID":"9e27f185-277d-4332-bb8c-1968a0b82b01","Type":"ContainerStarted","Data":"f4783a1c42f4b6d99075125ee71577217699a611d0aae47d93f88edf5e58bc41"} Nov 26 22:54:34 crc kubenswrapper[5008]: I1126 22:54:34.742833 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-rksbv" event={"ID":"a27cbbc0-6a27-4b03-8788-c8f3fdd090b1","Type":"ContainerStarted","Data":"045368fac2d0f09a8ddc772edc60d98ee0ea2941396ac8b4c71f25f6bb79ffe6"} Nov 26 22:54:34 crc kubenswrapper[5008]: I1126 22:54:34.770516 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-index-rksbv" podStartSLOduration=1.786381617 podStartE2EDuration="2.770489462s" 
podCreationTimestamp="2025-11-26 22:54:32 +0000 UTC" firstStartedPulling="2025-11-26 22:54:33.111891108 +0000 UTC m=+948.524585110" lastFinishedPulling="2025-11-26 22:54:34.095998913 +0000 UTC m=+949.508692955" observedRunningTime="2025-11-26 22:54:34.764690591 +0000 UTC m=+950.177384633" watchObservedRunningTime="2025-11-26 22:54:34.770489462 +0000 UTC m=+950.183183504" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.100670 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-k9zcg" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.105086 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-12a7-account-create-update-hd8tx" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.173822 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn54t\" (UniqueName: \"kubernetes.io/projected/8c1a3cd7-269b-4810-8e91-fba246cea26f-kube-api-access-vn54t\") pod \"8c1a3cd7-269b-4810-8e91-fba246cea26f\" (UID: \"8c1a3cd7-269b-4810-8e91-fba246cea26f\") " Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.173874 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kkx5\" (UniqueName: \"kubernetes.io/projected/9e27f185-277d-4332-bb8c-1968a0b82b01-kube-api-access-2kkx5\") pod \"9e27f185-277d-4332-bb8c-1968a0b82b01\" (UID: \"9e27f185-277d-4332-bb8c-1968a0b82b01\") " Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.174011 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e27f185-277d-4332-bb8c-1968a0b82b01-operator-scripts\") pod \"9e27f185-277d-4332-bb8c-1968a0b82b01\" (UID: \"9e27f185-277d-4332-bb8c-1968a0b82b01\") " Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.174042 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c1a3cd7-269b-4810-8e91-fba246cea26f-operator-scripts\") pod \"8c1a3cd7-269b-4810-8e91-fba246cea26f\" (UID: \"8c1a3cd7-269b-4810-8e91-fba246cea26f\") " Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.175116 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c1a3cd7-269b-4810-8e91-fba246cea26f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c1a3cd7-269b-4810-8e91-fba246cea26f" (UID: "8c1a3cd7-269b-4810-8e91-fba246cea26f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.175149 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e27f185-277d-4332-bb8c-1968a0b82b01-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e27f185-277d-4332-bb8c-1968a0b82b01" (UID: "9e27f185-277d-4332-bb8c-1968a0b82b01"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.180985 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e27f185-277d-4332-bb8c-1968a0b82b01-kube-api-access-2kkx5" (OuterVolumeSpecName: "kube-api-access-2kkx5") pod "9e27f185-277d-4332-bb8c-1968a0b82b01" (UID: "9e27f185-277d-4332-bb8c-1968a0b82b01"). InnerVolumeSpecName "kube-api-access-2kkx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.185115 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c1a3cd7-269b-4810-8e91-fba246cea26f-kube-api-access-vn54t" (OuterVolumeSpecName: "kube-api-access-vn54t") pod "8c1a3cd7-269b-4810-8e91-fba246cea26f" (UID: "8c1a3cd7-269b-4810-8e91-fba246cea26f"). 
InnerVolumeSpecName "kube-api-access-vn54t". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.277703 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn54t\" (UniqueName: \"kubernetes.io/projected/8c1a3cd7-269b-4810-8e91-fba246cea26f-kube-api-access-vn54t\") on node \"crc\" DevicePath \"\"" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.277756 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kkx5\" (UniqueName: \"kubernetes.io/projected/9e27f185-277d-4332-bb8c-1968a0b82b01-kube-api-access-2kkx5\") on node \"crc\" DevicePath \"\"" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.277777 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e27f185-277d-4332-bb8c-1968a0b82b01-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.277794 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c1a3cd7-269b-4810-8e91-fba246cea26f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.754671 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-12a7-account-create-update-hd8tx" event={"ID":"9e27f185-277d-4332-bb8c-1968a0b82b01","Type":"ContainerDied","Data":"f4783a1c42f4b6d99075125ee71577217699a611d0aae47d93f88edf5e58bc41"} Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.755056 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4783a1c42f4b6d99075125ee71577217699a611d0aae47d93f88edf5e58bc41" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.754685 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-12a7-account-create-update-hd8tx" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.758014 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-k9zcg" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.758131 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-k9zcg" event={"ID":"8c1a3cd7-269b-4810-8e91-fba246cea26f","Type":"ContainerDied","Data":"e621622c11d968cf03cc27a82eec68a16568fb5dddb541657c3aa589940e64a2"} Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.758175 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e621622c11d968cf03cc27a82eec68a16568fb5dddb541657c3aa589940e64a2" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.928984 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-index-6htcj"] Nov 26 22:54:35 crc kubenswrapper[5008]: E1126 22:54:35.929355 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c1a3cd7-269b-4810-8e91-fba246cea26f" containerName="mariadb-database-create" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.929383 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c1a3cd7-269b-4810-8e91-fba246cea26f" containerName="mariadb-database-create" Nov 26 22:54:35 crc kubenswrapper[5008]: E1126 22:54:35.929407 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e27f185-277d-4332-bb8c-1968a0b82b01" containerName="mariadb-account-create-update" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.929420 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e27f185-277d-4332-bb8c-1968a0b82b01" containerName="mariadb-account-create-update" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.929606 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e27f185-277d-4332-bb8c-1968a0b82b01" 
containerName="mariadb-account-create-update" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.929640 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c1a3cd7-269b-4810-8e91-fba246cea26f" containerName="mariadb-database-create" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.930494 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-6htcj" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.932930 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-index-dockercfg-qzk5r" Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.941260 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-6htcj"] Nov 26 22:54:35 crc kubenswrapper[5008]: I1126 22:54:35.989295 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qp7z\" (UniqueName: \"kubernetes.io/projected/fc81a358-693d-48c1-bcc1-3c1ba80a1e22-kube-api-access-2qp7z\") pod \"swift-operator-index-6htcj\" (UID: \"fc81a358-693d-48c1-bcc1-3c1ba80a1e22\") " pod="openstack-operators/swift-operator-index-6htcj" Nov 26 22:54:36 crc kubenswrapper[5008]: I1126 22:54:36.090433 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qp7z\" (UniqueName: \"kubernetes.io/projected/fc81a358-693d-48c1-bcc1-3c1ba80a1e22-kube-api-access-2qp7z\") pod \"swift-operator-index-6htcj\" (UID: \"fc81a358-693d-48c1-bcc1-3c1ba80a1e22\") " pod="openstack-operators/swift-operator-index-6htcj" Nov 26 22:54:36 crc kubenswrapper[5008]: I1126 22:54:36.119225 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qp7z\" (UniqueName: \"kubernetes.io/projected/fc81a358-693d-48c1-bcc1-3c1ba80a1e22-kube-api-access-2qp7z\") pod \"swift-operator-index-6htcj\" (UID: \"fc81a358-693d-48c1-bcc1-3c1ba80a1e22\") " 
pod="openstack-operators/swift-operator-index-6htcj" Nov 26 22:54:36 crc kubenswrapper[5008]: I1126 22:54:36.214155 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 22:54:36 crc kubenswrapper[5008]: I1126 22:54:36.262020 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-6htcj" Nov 26 22:54:36 crc kubenswrapper[5008]: I1126 22:54:36.704223 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-6htcj"] Nov 26 22:54:36 crc kubenswrapper[5008]: I1126 22:54:36.764952 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-6htcj" event={"ID":"fc81a358-693d-48c1-bcc1-3c1ba80a1e22","Type":"ContainerStarted","Data":"bf4dd72008ea1ebb8e8a4df4b5ded503ab0f6887ec5817fc0006a8782acba0c1"} Nov 26 22:54:37 crc kubenswrapper[5008]: I1126 22:54:37.773922 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-6htcj" event={"ID":"fc81a358-693d-48c1-bcc1-3c1ba80a1e22","Type":"ContainerStarted","Data":"c1d14ed5237b3a8dfc2c179dc57e84b541be5ab806f9bb6e27b0929c000d33bc"} Nov 26 22:54:37 crc kubenswrapper[5008]: I1126 22:54:37.795056 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-index-6htcj" podStartSLOduration=2.006458475 podStartE2EDuration="2.795030098s" podCreationTimestamp="2025-11-26 22:54:35 +0000 UTC" firstStartedPulling="2025-11-26 22:54:36.714009568 +0000 UTC m=+952.126703570" lastFinishedPulling="2025-11-26 22:54:37.502581191 +0000 UTC m=+952.915275193" observedRunningTime="2025-11-26 22:54:37.792168008 +0000 UTC m=+953.204862020" watchObservedRunningTime="2025-11-26 22:54:37.795030098 +0000 UTC m=+953.207724120" Nov 26 22:54:37 crc kubenswrapper[5008]: I1126 22:54:37.933064 5008 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["glance-kuttl-tests/keystone-db-sync-th5nj"] Nov 26 22:54:37 crc kubenswrapper[5008]: I1126 22:54:37.933788 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-th5nj" Nov 26 22:54:37 crc kubenswrapper[5008]: I1126 22:54:37.938209 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Nov 26 22:54:37 crc kubenswrapper[5008]: I1126 22:54:37.938917 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-vmzv2" Nov 26 22:54:37 crc kubenswrapper[5008]: I1126 22:54:37.940114 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Nov 26 22:54:37 crc kubenswrapper[5008]: I1126 22:54:37.940650 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Nov 26 22:54:37 crc kubenswrapper[5008]: I1126 22:54:37.955945 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-th5nj"] Nov 26 22:54:38 crc kubenswrapper[5008]: I1126 22:54:38.022754 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aee44aad-1da5-48b5-8ba9-2e3d60a1c39a-config-data\") pod \"keystone-db-sync-th5nj\" (UID: \"aee44aad-1da5-48b5-8ba9-2e3d60a1c39a\") " pod="glance-kuttl-tests/keystone-db-sync-th5nj" Nov 26 22:54:38 crc kubenswrapper[5008]: I1126 22:54:38.022825 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5jmh\" (UniqueName: \"kubernetes.io/projected/aee44aad-1da5-48b5-8ba9-2e3d60a1c39a-kube-api-access-r5jmh\") pod \"keystone-db-sync-th5nj\" (UID: \"aee44aad-1da5-48b5-8ba9-2e3d60a1c39a\") " pod="glance-kuttl-tests/keystone-db-sync-th5nj" Nov 26 22:54:38 crc kubenswrapper[5008]: I1126 22:54:38.124524 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aee44aad-1da5-48b5-8ba9-2e3d60a1c39a-config-data\") pod \"keystone-db-sync-th5nj\" (UID: \"aee44aad-1da5-48b5-8ba9-2e3d60a1c39a\") " pod="glance-kuttl-tests/keystone-db-sync-th5nj" Nov 26 22:54:38 crc kubenswrapper[5008]: I1126 22:54:38.125006 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5jmh\" (UniqueName: \"kubernetes.io/projected/aee44aad-1da5-48b5-8ba9-2e3d60a1c39a-kube-api-access-r5jmh\") pod \"keystone-db-sync-th5nj\" (UID: \"aee44aad-1da5-48b5-8ba9-2e3d60a1c39a\") " pod="glance-kuttl-tests/keystone-db-sync-th5nj" Nov 26 22:54:38 crc kubenswrapper[5008]: I1126 22:54:38.134478 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aee44aad-1da5-48b5-8ba9-2e3d60a1c39a-config-data\") pod \"keystone-db-sync-th5nj\" (UID: \"aee44aad-1da5-48b5-8ba9-2e3d60a1c39a\") " pod="glance-kuttl-tests/keystone-db-sync-th5nj" Nov 26 22:54:38 crc kubenswrapper[5008]: I1126 22:54:38.146263 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5jmh\" (UniqueName: \"kubernetes.io/projected/aee44aad-1da5-48b5-8ba9-2e3d60a1c39a-kube-api-access-r5jmh\") pod \"keystone-db-sync-th5nj\" (UID: \"aee44aad-1da5-48b5-8ba9-2e3d60a1c39a\") " pod="glance-kuttl-tests/keystone-db-sync-th5nj" Nov 26 22:54:38 crc kubenswrapper[5008]: I1126 22:54:38.261930 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-th5nj" Nov 26 22:54:38 crc kubenswrapper[5008]: I1126 22:54:38.773761 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-th5nj"] Nov 26 22:54:38 crc kubenswrapper[5008]: W1126 22:54:38.790128 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaee44aad_1da5_48b5_8ba9_2e3d60a1c39a.slice/crio-93bc0ee3773623fae6638eeb66b6e09e558b33daa8557db44f4d2ad249f3c3d9 WatchSource:0}: Error finding container 93bc0ee3773623fae6638eeb66b6e09e558b33daa8557db44f4d2ad249f3c3d9: Status 404 returned error can't find the container with id 93bc0ee3773623fae6638eeb66b6e09e558b33daa8557db44f4d2ad249f3c3d9 Nov 26 22:54:39 crc kubenswrapper[5008]: I1126 22:54:39.789581 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-th5nj" event={"ID":"aee44aad-1da5-48b5-8ba9-2e3d60a1c39a","Type":"ContainerStarted","Data":"93bc0ee3773623fae6638eeb66b6e09e558b33daa8557db44f4d2ad249f3c3d9"} Nov 26 22:54:42 crc kubenswrapper[5008]: I1126 22:54:42.850112 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-index-rksbv" Nov 26 22:54:42 crc kubenswrapper[5008]: I1126 22:54:42.850636 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/horizon-operator-index-rksbv" Nov 26 22:54:42 crc kubenswrapper[5008]: I1126 22:54:42.886712 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/horizon-operator-index-rksbv" Nov 26 22:54:43 crc kubenswrapper[5008]: I1126 22:54:43.853414 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-index-rksbv" Nov 26 22:54:46 crc kubenswrapper[5008]: I1126 22:54:46.262948 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/swift-operator-index-6htcj" Nov 26 22:54:46 crc kubenswrapper[5008]: I1126 22:54:46.263249 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/swift-operator-index-6htcj" Nov 26 22:54:46 crc kubenswrapper[5008]: I1126 22:54:46.302036 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/swift-operator-index-6htcj" Nov 26 22:54:46 crc kubenswrapper[5008]: I1126 22:54:46.898592 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-index-6htcj" Nov 26 22:54:49 crc kubenswrapper[5008]: I1126 22:54:49.864471 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-th5nj" event={"ID":"aee44aad-1da5-48b5-8ba9-2e3d60a1c39a","Type":"ContainerStarted","Data":"9bb0d50ea41fd166fe0b710b447c288a669640cf135fc4f1a31995e55bbb4815"} Nov 26 22:54:49 crc kubenswrapper[5008]: I1126 22:54:49.883412 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-db-sync-th5nj" podStartSLOduration=2.5201247220000003 podStartE2EDuration="12.883396731s" podCreationTimestamp="2025-11-26 22:54:37 +0000 UTC" firstStartedPulling="2025-11-26 22:54:38.792614987 +0000 UTC m=+954.205309039" lastFinishedPulling="2025-11-26 22:54:49.155887026 +0000 UTC m=+964.568581048" observedRunningTime="2025-11-26 22:54:49.881169181 +0000 UTC m=+965.293863193" watchObservedRunningTime="2025-11-26 22:54:49.883396731 +0000 UTC m=+965.296090733" Nov 26 22:54:51 crc kubenswrapper[5008]: I1126 22:54:51.385447 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667"] Nov 26 22:54:51 crc kubenswrapper[5008]: I1126 22:54:51.387048 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667" Nov 26 22:54:51 crc kubenswrapper[5008]: I1126 22:54:51.389704 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6kdt6" Nov 26 22:54:51 crc kubenswrapper[5008]: I1126 22:54:51.399640 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667"] Nov 26 22:54:51 crc kubenswrapper[5008]: I1126 22:54:51.436584 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06d114a8-c14b-4b8b-9467-42f1c99b81a1-bundle\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667\" (UID: \"06d114a8-c14b-4b8b-9467-42f1c99b81a1\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667" Nov 26 22:54:51 crc kubenswrapper[5008]: I1126 22:54:51.436798 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06d114a8-c14b-4b8b-9467-42f1c99b81a1-util\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667\" (UID: \"06d114a8-c14b-4b8b-9467-42f1c99b81a1\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667" Nov 26 22:54:51 crc kubenswrapper[5008]: I1126 22:54:51.436867 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9ch6\" (UniqueName: \"kubernetes.io/projected/06d114a8-c14b-4b8b-9467-42f1c99b81a1-kube-api-access-k9ch6\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667\" (UID: \"06d114a8-c14b-4b8b-9467-42f1c99b81a1\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667" Nov 26 22:54:51 crc kubenswrapper[5008]: I1126 
22:54:51.538942 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06d114a8-c14b-4b8b-9467-42f1c99b81a1-bundle\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667\" (UID: \"06d114a8-c14b-4b8b-9467-42f1c99b81a1\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667" Nov 26 22:54:51 crc kubenswrapper[5008]: I1126 22:54:51.539077 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06d114a8-c14b-4b8b-9467-42f1c99b81a1-util\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667\" (UID: \"06d114a8-c14b-4b8b-9467-42f1c99b81a1\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667" Nov 26 22:54:51 crc kubenswrapper[5008]: I1126 22:54:51.539111 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9ch6\" (UniqueName: \"kubernetes.io/projected/06d114a8-c14b-4b8b-9467-42f1c99b81a1-kube-api-access-k9ch6\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667\" (UID: \"06d114a8-c14b-4b8b-9467-42f1c99b81a1\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667" Nov 26 22:54:51 crc kubenswrapper[5008]: I1126 22:54:51.540010 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06d114a8-c14b-4b8b-9467-42f1c99b81a1-bundle\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667\" (UID: \"06d114a8-c14b-4b8b-9467-42f1c99b81a1\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667" Nov 26 22:54:51 crc kubenswrapper[5008]: I1126 22:54:51.540505 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/06d114a8-c14b-4b8b-9467-42f1c99b81a1-util\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667\" (UID: \"06d114a8-c14b-4b8b-9467-42f1c99b81a1\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667" Nov 26 22:54:51 crc kubenswrapper[5008]: I1126 22:54:51.562315 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9ch6\" (UniqueName: \"kubernetes.io/projected/06d114a8-c14b-4b8b-9467-42f1c99b81a1-kube-api-access-k9ch6\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667\" (UID: \"06d114a8-c14b-4b8b-9467-42f1c99b81a1\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667" Nov 26 22:54:51 crc kubenswrapper[5008]: I1126 22:54:51.726980 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667" Nov 26 22:54:52 crc kubenswrapper[5008]: I1126 22:54:52.035299 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667"] Nov 26 22:54:52 crc kubenswrapper[5008]: W1126 22:54:52.044654 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06d114a8_c14b_4b8b_9467_42f1c99b81a1.slice/crio-dc6c0f74c10cf3f3b15d433462311a31ac6b7131f5725858b6357b3e916826eb WatchSource:0}: Error finding container dc6c0f74c10cf3f3b15d433462311a31ac6b7131f5725858b6357b3e916826eb: Status 404 returned error can't find the container with id dc6c0f74c10cf3f3b15d433462311a31ac6b7131f5725858b6357b3e916826eb Nov 26 22:54:52 crc kubenswrapper[5008]: I1126 22:54:52.374295 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8"] Nov 26 22:54:52 crc kubenswrapper[5008]: I1126 
22:54:52.375951 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8" Nov 26 22:54:52 crc kubenswrapper[5008]: I1126 22:54:52.391840 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8"] Nov 26 22:54:52 crc kubenswrapper[5008]: I1126 22:54:52.454305 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0dbac73a-117b-432a-a7fb-684b58cf31b8-bundle\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8\" (UID: \"0dbac73a-117b-432a-a7fb-684b58cf31b8\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8" Nov 26 22:54:52 crc kubenswrapper[5008]: I1126 22:54:52.454375 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0dbac73a-117b-432a-a7fb-684b58cf31b8-util\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8\" (UID: \"0dbac73a-117b-432a-a7fb-684b58cf31b8\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8" Nov 26 22:54:52 crc kubenswrapper[5008]: I1126 22:54:52.454495 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4whjs\" (UniqueName: \"kubernetes.io/projected/0dbac73a-117b-432a-a7fb-684b58cf31b8-kube-api-access-4whjs\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8\" (UID: \"0dbac73a-117b-432a-a7fb-684b58cf31b8\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8" Nov 26 22:54:52 crc kubenswrapper[5008]: I1126 22:54:52.555696 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4whjs\" (UniqueName: \"kubernetes.io/projected/0dbac73a-117b-432a-a7fb-684b58cf31b8-kube-api-access-4whjs\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8\" (UID: \"0dbac73a-117b-432a-a7fb-684b58cf31b8\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8" Nov 26 22:54:52 crc kubenswrapper[5008]: I1126 22:54:52.555875 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0dbac73a-117b-432a-a7fb-684b58cf31b8-bundle\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8\" (UID: \"0dbac73a-117b-432a-a7fb-684b58cf31b8\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8" Nov 26 22:54:52 crc kubenswrapper[5008]: I1126 22:54:52.556633 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0dbac73a-117b-432a-a7fb-684b58cf31b8-bundle\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8\" (UID: \"0dbac73a-117b-432a-a7fb-684b58cf31b8\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8" Nov 26 22:54:52 crc kubenswrapper[5008]: I1126 22:54:52.556836 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0dbac73a-117b-432a-a7fb-684b58cf31b8-util\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8\" (UID: \"0dbac73a-117b-432a-a7fb-684b58cf31b8\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8" Nov 26 22:54:52 crc kubenswrapper[5008]: I1126 22:54:52.557198 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0dbac73a-117b-432a-a7fb-684b58cf31b8-util\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8\" (UID: 
\"0dbac73a-117b-432a-a7fb-684b58cf31b8\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8" Nov 26 22:54:52 crc kubenswrapper[5008]: I1126 22:54:52.592918 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4whjs\" (UniqueName: \"kubernetes.io/projected/0dbac73a-117b-432a-a7fb-684b58cf31b8-kube-api-access-4whjs\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8\" (UID: \"0dbac73a-117b-432a-a7fb-684b58cf31b8\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8" Nov 26 22:54:52 crc kubenswrapper[5008]: I1126 22:54:52.708434 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8" Nov 26 22:54:52 crc kubenswrapper[5008]: I1126 22:54:52.888337 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667" event={"ID":"06d114a8-c14b-4b8b-9467-42f1c99b81a1","Type":"ContainerStarted","Data":"dc6c0f74c10cf3f3b15d433462311a31ac6b7131f5725858b6357b3e916826eb"} Nov 26 22:54:53 crc kubenswrapper[5008]: I1126 22:54:53.008871 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8"] Nov 26 22:54:53 crc kubenswrapper[5008]: I1126 22:54:53.898453 5008 generic.go:334] "Generic (PLEG): container finished" podID="0dbac73a-117b-432a-a7fb-684b58cf31b8" containerID="4a11dad5f7ae18a4f3d9ed2ef1ac5d92b8ecdc90afc82194428f0c88a06755f7" exitCode=0 Nov 26 22:54:53 crc kubenswrapper[5008]: I1126 22:54:53.898864 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8" 
event={"ID":"0dbac73a-117b-432a-a7fb-684b58cf31b8","Type":"ContainerDied","Data":"4a11dad5f7ae18a4f3d9ed2ef1ac5d92b8ecdc90afc82194428f0c88a06755f7"} Nov 26 22:54:53 crc kubenswrapper[5008]: I1126 22:54:53.898902 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8" event={"ID":"0dbac73a-117b-432a-a7fb-684b58cf31b8","Type":"ContainerStarted","Data":"c3c1678953b4d0d8c087e7469d9dc146acbaac805bdfb42f432e3a8f15071c52"} Nov 26 22:54:53 crc kubenswrapper[5008]: I1126 22:54:53.903076 5008 generic.go:334] "Generic (PLEG): container finished" podID="06d114a8-c14b-4b8b-9467-42f1c99b81a1" containerID="8616b0a48ed016190a71d20255ee7a345fd400d5f33f1909d639727283feb6b8" exitCode=0 Nov 26 22:54:53 crc kubenswrapper[5008]: I1126 22:54:53.903113 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667" event={"ID":"06d114a8-c14b-4b8b-9467-42f1c99b81a1","Type":"ContainerDied","Data":"8616b0a48ed016190a71d20255ee7a345fd400d5f33f1909d639727283feb6b8"} Nov 26 22:54:54 crc kubenswrapper[5008]: I1126 22:54:54.917681 5008 generic.go:334] "Generic (PLEG): container finished" podID="06d114a8-c14b-4b8b-9467-42f1c99b81a1" containerID="776117e2dd6e488b3e1381d05c78f823b519d46f069c3b71e3be73c1b3ec3dc4" exitCode=0 Nov 26 22:54:54 crc kubenswrapper[5008]: I1126 22:54:54.917756 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667" event={"ID":"06d114a8-c14b-4b8b-9467-42f1c99b81a1","Type":"ContainerDied","Data":"776117e2dd6e488b3e1381d05c78f823b519d46f069c3b71e3be73c1b3ec3dc4"} Nov 26 22:54:54 crc kubenswrapper[5008]: I1126 22:54:54.923302 5008 generic.go:334] "Generic (PLEG): container finished" podID="aee44aad-1da5-48b5-8ba9-2e3d60a1c39a" containerID="9bb0d50ea41fd166fe0b710b447c288a669640cf135fc4f1a31995e55bbb4815" exitCode=0 Nov 26 
22:54:54 crc kubenswrapper[5008]: I1126 22:54:54.923348 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-th5nj" event={"ID":"aee44aad-1da5-48b5-8ba9-2e3d60a1c39a","Type":"ContainerDied","Data":"9bb0d50ea41fd166fe0b710b447c288a669640cf135fc4f1a31995e55bbb4815"} Nov 26 22:54:55 crc kubenswrapper[5008]: I1126 22:54:55.933638 5008 generic.go:334] "Generic (PLEG): container finished" podID="0dbac73a-117b-432a-a7fb-684b58cf31b8" containerID="e31fdf1e5d4fbbe4c87b271228147137bdb9263094a03d065418b2599bfd55fe" exitCode=0 Nov 26 22:54:55 crc kubenswrapper[5008]: I1126 22:54:55.933788 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8" event={"ID":"0dbac73a-117b-432a-a7fb-684b58cf31b8","Type":"ContainerDied","Data":"e31fdf1e5d4fbbe4c87b271228147137bdb9263094a03d065418b2599bfd55fe"} Nov 26 22:54:55 crc kubenswrapper[5008]: I1126 22:54:55.939686 5008 generic.go:334] "Generic (PLEG): container finished" podID="06d114a8-c14b-4b8b-9467-42f1c99b81a1" containerID="7fc46f1ca36d5ec9e77dd3d061973b9a0af5563e0218df9bec53a78e8e69f5d2" exitCode=0 Nov 26 22:54:55 crc kubenswrapper[5008]: I1126 22:54:55.939784 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667" event={"ID":"06d114a8-c14b-4b8b-9467-42f1c99b81a1","Type":"ContainerDied","Data":"7fc46f1ca36d5ec9e77dd3d061973b9a0af5563e0218df9bec53a78e8e69f5d2"} Nov 26 22:54:56 crc kubenswrapper[5008]: I1126 22:54:56.305947 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-th5nj" Nov 26 22:54:56 crc kubenswrapper[5008]: I1126 22:54:56.422221 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aee44aad-1da5-48b5-8ba9-2e3d60a1c39a-config-data\") pod \"aee44aad-1da5-48b5-8ba9-2e3d60a1c39a\" (UID: \"aee44aad-1da5-48b5-8ba9-2e3d60a1c39a\") " Nov 26 22:54:56 crc kubenswrapper[5008]: I1126 22:54:56.422363 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5jmh\" (UniqueName: \"kubernetes.io/projected/aee44aad-1da5-48b5-8ba9-2e3d60a1c39a-kube-api-access-r5jmh\") pod \"aee44aad-1da5-48b5-8ba9-2e3d60a1c39a\" (UID: \"aee44aad-1da5-48b5-8ba9-2e3d60a1c39a\") " Nov 26 22:54:56 crc kubenswrapper[5008]: I1126 22:54:56.430573 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee44aad-1da5-48b5-8ba9-2e3d60a1c39a-kube-api-access-r5jmh" (OuterVolumeSpecName: "kube-api-access-r5jmh") pod "aee44aad-1da5-48b5-8ba9-2e3d60a1c39a" (UID: "aee44aad-1da5-48b5-8ba9-2e3d60a1c39a"). InnerVolumeSpecName "kube-api-access-r5jmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:54:56 crc kubenswrapper[5008]: I1126 22:54:56.480613 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee44aad-1da5-48b5-8ba9-2e3d60a1c39a-config-data" (OuterVolumeSpecName: "config-data") pod "aee44aad-1da5-48b5-8ba9-2e3d60a1c39a" (UID: "aee44aad-1da5-48b5-8ba9-2e3d60a1c39a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:54:56 crc kubenswrapper[5008]: I1126 22:54:56.524722 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aee44aad-1da5-48b5-8ba9-2e3d60a1c39a-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 22:54:56 crc kubenswrapper[5008]: I1126 22:54:56.524786 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5jmh\" (UniqueName: \"kubernetes.io/projected/aee44aad-1da5-48b5-8ba9-2e3d60a1c39a-kube-api-access-r5jmh\") on node \"crc\" DevicePath \"\"" Nov 26 22:54:56 crc kubenswrapper[5008]: I1126 22:54:56.950458 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-th5nj" event={"ID":"aee44aad-1da5-48b5-8ba9-2e3d60a1c39a","Type":"ContainerDied","Data":"93bc0ee3773623fae6638eeb66b6e09e558b33daa8557db44f4d2ad249f3c3d9"} Nov 26 22:54:56 crc kubenswrapper[5008]: I1126 22:54:56.950557 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93bc0ee3773623fae6638eeb66b6e09e558b33daa8557db44f4d2ad249f3c3d9" Nov 26 22:54:56 crc kubenswrapper[5008]: I1126 22:54:56.950560 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-th5nj" Nov 26 22:54:56 crc kubenswrapper[5008]: I1126 22:54:56.953043 5008 generic.go:334] "Generic (PLEG): container finished" podID="0dbac73a-117b-432a-a7fb-684b58cf31b8" containerID="01118e0c3c8e0b47e2c3c51e76f5fdc83135ad9f8ab6e8deb5fb46dd428a3b6a" exitCode=0 Nov 26 22:54:56 crc kubenswrapper[5008]: I1126 22:54:56.953881 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8" event={"ID":"0dbac73a-117b-432a-a7fb-684b58cf31b8","Type":"ContainerDied","Data":"01118e0c3c8e0b47e2c3c51e76f5fdc83135ad9f8ab6e8deb5fb46dd428a3b6a"} Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.171685 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-vfrf8"] Nov 26 22:54:57 crc kubenswrapper[5008]: E1126 22:54:57.172132 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee44aad-1da5-48b5-8ba9-2e3d60a1c39a" containerName="keystone-db-sync" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.172154 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee44aad-1da5-48b5-8ba9-2e3d60a1c39a" containerName="keystone-db-sync" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.172397 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="aee44aad-1da5-48b5-8ba9-2e3d60a1c39a" containerName="keystone-db-sync" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.173299 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.180409 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.180718 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.180598 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.180783 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"osp-secret" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.186132 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-vmzv2" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.196480 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-vfrf8"] Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.252272 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhmhp\" (UniqueName: \"kubernetes.io/projected/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-kube-api-access-jhmhp\") pod \"keystone-bootstrap-vfrf8\" (UID: \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\") " pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.252327 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-credential-keys\") pod \"keystone-bootstrap-vfrf8\" (UID: \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\") " pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.252403 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-config-data\") pod \"keystone-bootstrap-vfrf8\" (UID: \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\") " pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.252427 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-scripts\") pod \"keystone-bootstrap-vfrf8\" (UID: \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\") " pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.252494 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-fernet-keys\") pod \"keystone-bootstrap-vfrf8\" (UID: \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\") " pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.318266 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.354105 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06d114a8-c14b-4b8b-9467-42f1c99b81a1-util\") pod \"06d114a8-c14b-4b8b-9467-42f1c99b81a1\" (UID: \"06d114a8-c14b-4b8b-9467-42f1c99b81a1\") " Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.354239 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9ch6\" (UniqueName: \"kubernetes.io/projected/06d114a8-c14b-4b8b-9467-42f1c99b81a1-kube-api-access-k9ch6\") pod \"06d114a8-c14b-4b8b-9467-42f1c99b81a1\" (UID: \"06d114a8-c14b-4b8b-9467-42f1c99b81a1\") " Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.354320 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06d114a8-c14b-4b8b-9467-42f1c99b81a1-bundle\") pod \"06d114a8-c14b-4b8b-9467-42f1c99b81a1\" (UID: \"06d114a8-c14b-4b8b-9467-42f1c99b81a1\") " Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.354518 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-scripts\") pod \"keystone-bootstrap-vfrf8\" (UID: \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\") " pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.355310 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06d114a8-c14b-4b8b-9467-42f1c99b81a1-bundle" (OuterVolumeSpecName: "bundle") pod "06d114a8-c14b-4b8b-9467-42f1c99b81a1" (UID: "06d114a8-c14b-4b8b-9467-42f1c99b81a1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.355601 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-fernet-keys\") pod \"keystone-bootstrap-vfrf8\" (UID: \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\") " pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.355667 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhmhp\" (UniqueName: \"kubernetes.io/projected/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-kube-api-access-jhmhp\") pod \"keystone-bootstrap-vfrf8\" (UID: \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\") " pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.355705 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-credential-keys\") pod \"keystone-bootstrap-vfrf8\" (UID: \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\") " pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.355895 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-config-data\") pod \"keystone-bootstrap-vfrf8\" (UID: \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\") " pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.355992 5008 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06d114a8-c14b-4b8b-9467-42f1c99b81a1-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.359320 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/06d114a8-c14b-4b8b-9467-42f1c99b81a1-kube-api-access-k9ch6" (OuterVolumeSpecName: "kube-api-access-k9ch6") pod "06d114a8-c14b-4b8b-9467-42f1c99b81a1" (UID: "06d114a8-c14b-4b8b-9467-42f1c99b81a1"). InnerVolumeSpecName "kube-api-access-k9ch6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.360870 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-fernet-keys\") pod \"keystone-bootstrap-vfrf8\" (UID: \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\") " pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.361049 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-credential-keys\") pod \"keystone-bootstrap-vfrf8\" (UID: \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\") " pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.364304 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-config-data\") pod \"keystone-bootstrap-vfrf8\" (UID: \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\") " pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.371798 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-scripts\") pod \"keystone-bootstrap-vfrf8\" (UID: \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\") " pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.375547 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhmhp\" (UniqueName: 
\"kubernetes.io/projected/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-kube-api-access-jhmhp\") pod \"keystone-bootstrap-vfrf8\" (UID: \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\") " pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.385156 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06d114a8-c14b-4b8b-9467-42f1c99b81a1-util" (OuterVolumeSpecName: "util") pod "06d114a8-c14b-4b8b-9467-42f1c99b81a1" (UID: "06d114a8-c14b-4b8b-9467-42f1c99b81a1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.457632 5008 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06d114a8-c14b-4b8b-9467-42f1c99b81a1-util\") on node \"crc\" DevicePath \"\"" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.457687 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9ch6\" (UniqueName: \"kubernetes.io/projected/06d114a8-c14b-4b8b-9467-42f1c99b81a1-kube-api-access-k9ch6\") on node \"crc\" DevicePath \"\"" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.506471 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.967747 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667" event={"ID":"06d114a8-c14b-4b8b-9467-42f1c99b81a1","Type":"ContainerDied","Data":"dc6c0f74c10cf3f3b15d433462311a31ac6b7131f5725858b6357b3e916826eb"} Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.968192 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc6c0f74c10cf3f3b15d433462311a31ac6b7131f5725858b6357b3e916826eb" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.968034 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368c2z667" Nov 26 22:54:57 crc kubenswrapper[5008]: I1126 22:54:57.997439 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-vfrf8"] Nov 26 22:54:58 crc kubenswrapper[5008]: W1126 22:54:58.007790 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab8e35ac_3cb8_4193_b35f_8826a3db3ea3.slice/crio-1ea721ca6f8cd9acd4b9dfc2c9f9bc5148f98d745aee2d830c4c38530aecd024 WatchSource:0}: Error finding container 1ea721ca6f8cd9acd4b9dfc2c9f9bc5148f98d745aee2d830c4c38530aecd024: Status 404 returned error can't find the container with id 1ea721ca6f8cd9acd4b9dfc2c9f9bc5148f98d745aee2d830c4c38530aecd024 Nov 26 22:54:58 crc kubenswrapper[5008]: I1126 22:54:58.347095 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8" Nov 26 22:54:58 crc kubenswrapper[5008]: I1126 22:54:58.374809 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4whjs\" (UniqueName: \"kubernetes.io/projected/0dbac73a-117b-432a-a7fb-684b58cf31b8-kube-api-access-4whjs\") pod \"0dbac73a-117b-432a-a7fb-684b58cf31b8\" (UID: \"0dbac73a-117b-432a-a7fb-684b58cf31b8\") " Nov 26 22:54:58 crc kubenswrapper[5008]: I1126 22:54:58.374901 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0dbac73a-117b-432a-a7fb-684b58cf31b8-bundle\") pod \"0dbac73a-117b-432a-a7fb-684b58cf31b8\" (UID: \"0dbac73a-117b-432a-a7fb-684b58cf31b8\") " Nov 26 22:54:58 crc kubenswrapper[5008]: I1126 22:54:58.375061 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0dbac73a-117b-432a-a7fb-684b58cf31b8-util\") pod \"0dbac73a-117b-432a-a7fb-684b58cf31b8\" (UID: \"0dbac73a-117b-432a-a7fb-684b58cf31b8\") " Nov 26 22:54:58 crc kubenswrapper[5008]: I1126 22:54:58.375952 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dbac73a-117b-432a-a7fb-684b58cf31b8-bundle" (OuterVolumeSpecName: "bundle") pod "0dbac73a-117b-432a-a7fb-684b58cf31b8" (UID: "0dbac73a-117b-432a-a7fb-684b58cf31b8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:54:58 crc kubenswrapper[5008]: I1126 22:54:58.379519 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dbac73a-117b-432a-a7fb-684b58cf31b8-kube-api-access-4whjs" (OuterVolumeSpecName: "kube-api-access-4whjs") pod "0dbac73a-117b-432a-a7fb-684b58cf31b8" (UID: "0dbac73a-117b-432a-a7fb-684b58cf31b8"). InnerVolumeSpecName "kube-api-access-4whjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:54:58 crc kubenswrapper[5008]: I1126 22:54:58.387618 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dbac73a-117b-432a-a7fb-684b58cf31b8-util" (OuterVolumeSpecName: "util") pod "0dbac73a-117b-432a-a7fb-684b58cf31b8" (UID: "0dbac73a-117b-432a-a7fb-684b58cf31b8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:54:58 crc kubenswrapper[5008]: I1126 22:54:58.477080 5008 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0dbac73a-117b-432a-a7fb-684b58cf31b8-util\") on node \"crc\" DevicePath \"\"" Nov 26 22:54:58 crc kubenswrapper[5008]: I1126 22:54:58.477129 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4whjs\" (UniqueName: \"kubernetes.io/projected/0dbac73a-117b-432a-a7fb-684b58cf31b8-kube-api-access-4whjs\") on node \"crc\" DevicePath \"\"" Nov 26 22:54:58 crc kubenswrapper[5008]: I1126 22:54:58.477153 5008 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0dbac73a-117b-432a-a7fb-684b58cf31b8-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 22:54:58 crc kubenswrapper[5008]: I1126 22:54:58.980442 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8" event={"ID":"0dbac73a-117b-432a-a7fb-684b58cf31b8","Type":"ContainerDied","Data":"c3c1678953b4d0d8c087e7469d9dc146acbaac805bdfb42f432e3a8f15071c52"} Nov 26 22:54:58 crc kubenswrapper[5008]: I1126 22:54:58.980487 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3c1678953b4d0d8c087e7469d9dc146acbaac805bdfb42f432e3a8f15071c52" Nov 26 22:54:58 crc kubenswrapper[5008]: I1126 22:54:58.980524 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b464j8" Nov 26 22:54:58 crc kubenswrapper[5008]: I1126 22:54:58.982885 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" event={"ID":"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3","Type":"ContainerStarted","Data":"75595be1d567ee1084581410f6fcbffc1d72aea47440e3caa7e5f1bc81dd3d84"} Nov 26 22:54:58 crc kubenswrapper[5008]: I1126 22:54:58.982994 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" event={"ID":"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3","Type":"ContainerStarted","Data":"1ea721ca6f8cd9acd4b9dfc2c9f9bc5148f98d745aee2d830c4c38530aecd024"} Nov 26 22:54:59 crc kubenswrapper[5008]: I1126 22:54:59.023420 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" podStartSLOduration=2.0233931 podStartE2EDuration="2.0233931s" podCreationTimestamp="2025-11-26 22:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:54:59.004368143 +0000 UTC m=+974.417062165" watchObservedRunningTime="2025-11-26 22:54:59.0233931 +0000 UTC m=+974.436087142" Nov 26 22:54:59 crc kubenswrapper[5008]: I1126 22:54:59.281583 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 22:54:59 crc kubenswrapper[5008]: I1126 22:54:59.282068 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 22:54:59 crc kubenswrapper[5008]: I1126 22:54:59.282136 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 26 22:54:59 crc kubenswrapper[5008]: I1126 22:54:59.282960 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7996ba3586721dd9a32b1864cf2a2c0cd8e592ad6992d5bba7e5d7e532bd504c"} pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 22:54:59 crc kubenswrapper[5008]: I1126 22:54:59.283140 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" containerID="cri-o://7996ba3586721dd9a32b1864cf2a2c0cd8e592ad6992d5bba7e5d7e532bd504c" gracePeriod=600 Nov 26 22:54:59 crc kubenswrapper[5008]: I1126 22:54:59.991364 5008 generic.go:334] "Generic (PLEG): container finished" podID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerID="7996ba3586721dd9a32b1864cf2a2c0cd8e592ad6992d5bba7e5d7e532bd504c" exitCode=0 Nov 26 22:54:59 crc kubenswrapper[5008]: I1126 22:54:59.991483 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" event={"ID":"8e558d58-c5ad-41f5-930f-36ac26b1a1ea","Type":"ContainerDied","Data":"7996ba3586721dd9a32b1864cf2a2c0cd8e592ad6992d5bba7e5d7e532bd504c"} Nov 26 22:54:59 crc kubenswrapper[5008]: I1126 22:54:59.992575 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" 
event={"ID":"8e558d58-c5ad-41f5-930f-36ac26b1a1ea","Type":"ContainerStarted","Data":"9fe41703f50fb88cb25a2da55af0500e1caf0de96a6d4c9a6682af82c31219e6"} Nov 26 22:54:59 crc kubenswrapper[5008]: I1126 22:54:59.992615 5008 scope.go:117] "RemoveContainer" containerID="52afabcce87945b7bdb95fab76560c9840f1056b4d5ad19b5a36a6c1c1f5fb46" Nov 26 22:55:01 crc kubenswrapper[5008]: I1126 22:55:01.009260 5008 generic.go:334] "Generic (PLEG): container finished" podID="ab8e35ac-3cb8-4193-b35f-8826a3db3ea3" containerID="75595be1d567ee1084581410f6fcbffc1d72aea47440e3caa7e5f1bc81dd3d84" exitCode=0 Nov 26 22:55:01 crc kubenswrapper[5008]: I1126 22:55:01.009355 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" event={"ID":"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3","Type":"ContainerDied","Data":"75595be1d567ee1084581410f6fcbffc1d72aea47440e3caa7e5f1bc81dd3d84"} Nov 26 22:55:02 crc kubenswrapper[5008]: I1126 22:55:02.269662 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" Nov 26 22:55:02 crc kubenswrapper[5008]: I1126 22:55:02.337951 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-scripts\") pod \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\" (UID: \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\") " Nov 26 22:55:02 crc kubenswrapper[5008]: I1126 22:55:02.338029 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-credential-keys\") pod \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\" (UID: \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\") " Nov 26 22:55:02 crc kubenswrapper[5008]: I1126 22:55:02.338092 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-fernet-keys\") pod \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\" (UID: \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\") " Nov 26 22:55:02 crc kubenswrapper[5008]: I1126 22:55:02.338188 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-config-data\") pod \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\" (UID: \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\") " Nov 26 22:55:02 crc kubenswrapper[5008]: I1126 22:55:02.338224 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhmhp\" (UniqueName: \"kubernetes.io/projected/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-kube-api-access-jhmhp\") pod \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\" (UID: \"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3\") " Nov 26 22:55:02 crc kubenswrapper[5008]: I1126 22:55:02.343407 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-scripts" (OuterVolumeSpecName: "scripts") pod "ab8e35ac-3cb8-4193-b35f-8826a3db3ea3" (UID: "ab8e35ac-3cb8-4193-b35f-8826a3db3ea3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:55:02 crc kubenswrapper[5008]: I1126 22:55:02.343565 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ab8e35ac-3cb8-4193-b35f-8826a3db3ea3" (UID: "ab8e35ac-3cb8-4193-b35f-8826a3db3ea3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:55:02 crc kubenswrapper[5008]: I1126 22:55:02.344286 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-kube-api-access-jhmhp" (OuterVolumeSpecName: "kube-api-access-jhmhp") pod "ab8e35ac-3cb8-4193-b35f-8826a3db3ea3" (UID: "ab8e35ac-3cb8-4193-b35f-8826a3db3ea3"). InnerVolumeSpecName "kube-api-access-jhmhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:55:02 crc kubenswrapper[5008]: I1126 22:55:02.345867 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ab8e35ac-3cb8-4193-b35f-8826a3db3ea3" (UID: "ab8e35ac-3cb8-4193-b35f-8826a3db3ea3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:55:02 crc kubenswrapper[5008]: I1126 22:55:02.357268 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-config-data" (OuterVolumeSpecName: "config-data") pod "ab8e35ac-3cb8-4193-b35f-8826a3db3ea3" (UID: "ab8e35ac-3cb8-4193-b35f-8826a3db3ea3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:55:02 crc kubenswrapper[5008]: I1126 22:55:02.439684 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 22:55:02 crc kubenswrapper[5008]: I1126 22:55:02.439723 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhmhp\" (UniqueName: \"kubernetes.io/projected/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-kube-api-access-jhmhp\") on node \"crc\" DevicePath \"\"" Nov 26 22:55:02 crc kubenswrapper[5008]: I1126 22:55:02.439739 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:55:02 crc kubenswrapper[5008]: I1126 22:55:02.439751 5008 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 26 22:55:02 crc kubenswrapper[5008]: I1126 22:55:02.439763 5008 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ab8e35ac-3cb8-4193-b35f-8826a3db3ea3-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.026615 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" event={"ID":"ab8e35ac-3cb8-4193-b35f-8826a3db3ea3","Type":"ContainerDied","Data":"1ea721ca6f8cd9acd4b9dfc2c9f9bc5148f98d745aee2d830c4c38530aecd024"} Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.026671 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ea721ca6f8cd9acd4b9dfc2c9f9bc5148f98d745aee2d830c4c38530aecd024" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.026711 5008 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-vfrf8" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.156555 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-74b9d76575-dkv4w"] Nov 26 22:55:03 crc kubenswrapper[5008]: E1126 22:55:03.157095 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab8e35ac-3cb8-4193-b35f-8826a3db3ea3" containerName="keystone-bootstrap" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.157108 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab8e35ac-3cb8-4193-b35f-8826a3db3ea3" containerName="keystone-bootstrap" Nov 26 22:55:03 crc kubenswrapper[5008]: E1126 22:55:03.157119 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dbac73a-117b-432a-a7fb-684b58cf31b8" containerName="extract" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.157125 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dbac73a-117b-432a-a7fb-684b58cf31b8" containerName="extract" Nov 26 22:55:03 crc kubenswrapper[5008]: E1126 22:55:03.157135 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dbac73a-117b-432a-a7fb-684b58cf31b8" containerName="pull" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.157141 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dbac73a-117b-432a-a7fb-684b58cf31b8" containerName="pull" Nov 26 22:55:03 crc kubenswrapper[5008]: E1126 22:55:03.157147 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d114a8-c14b-4b8b-9467-42f1c99b81a1" containerName="extract" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.157153 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d114a8-c14b-4b8b-9467-42f1c99b81a1" containerName="extract" Nov 26 22:55:03 crc kubenswrapper[5008]: E1126 22:55:03.157168 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d114a8-c14b-4b8b-9467-42f1c99b81a1" 
containerName="util" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.157174 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d114a8-c14b-4b8b-9467-42f1c99b81a1" containerName="util" Nov 26 22:55:03 crc kubenswrapper[5008]: E1126 22:55:03.157186 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d114a8-c14b-4b8b-9467-42f1c99b81a1" containerName="pull" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.157191 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d114a8-c14b-4b8b-9467-42f1c99b81a1" containerName="pull" Nov 26 22:55:03 crc kubenswrapper[5008]: E1126 22:55:03.157199 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dbac73a-117b-432a-a7fb-684b58cf31b8" containerName="util" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.157205 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dbac73a-117b-432a-a7fb-684b58cf31b8" containerName="util" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.157317 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="06d114a8-c14b-4b8b-9467-42f1c99b81a1" containerName="extract" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.157331 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dbac73a-117b-432a-a7fb-684b58cf31b8" containerName="extract" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.157338 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab8e35ac-3cb8-4193-b35f-8826a3db3ea3" containerName="keystone-bootstrap" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.157797 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.160556 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.161117 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-vmzv2" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.161433 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.165609 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.179535 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-74b9d76575-dkv4w"] Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.254453 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1138ccfc-7f19-4b1d-aeae-8a37fbb21637-fernet-keys\") pod \"keystone-74b9d76575-dkv4w\" (UID: \"1138ccfc-7f19-4b1d-aeae-8a37fbb21637\") " pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.254818 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1138ccfc-7f19-4b1d-aeae-8a37fbb21637-scripts\") pod \"keystone-74b9d76575-dkv4w\" (UID: \"1138ccfc-7f19-4b1d-aeae-8a37fbb21637\") " pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.255205 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1138ccfc-7f19-4b1d-aeae-8a37fbb21637-config-data\") pod \"keystone-74b9d76575-dkv4w\" (UID: \"1138ccfc-7f19-4b1d-aeae-8a37fbb21637\") " pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.255375 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1138ccfc-7f19-4b1d-aeae-8a37fbb21637-credential-keys\") pod \"keystone-74b9d76575-dkv4w\" (UID: \"1138ccfc-7f19-4b1d-aeae-8a37fbb21637\") " pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.255576 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4g79\" (UniqueName: \"kubernetes.io/projected/1138ccfc-7f19-4b1d-aeae-8a37fbb21637-kube-api-access-f4g79\") pod \"keystone-74b9d76575-dkv4w\" (UID: \"1138ccfc-7f19-4b1d-aeae-8a37fbb21637\") " pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.369048 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1138ccfc-7f19-4b1d-aeae-8a37fbb21637-config-data\") pod \"keystone-74b9d76575-dkv4w\" (UID: \"1138ccfc-7f19-4b1d-aeae-8a37fbb21637\") " pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.369098 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1138ccfc-7f19-4b1d-aeae-8a37fbb21637-credential-keys\") pod \"keystone-74b9d76575-dkv4w\" (UID: \"1138ccfc-7f19-4b1d-aeae-8a37fbb21637\") " pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.369167 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4g79\" 
(UniqueName: \"kubernetes.io/projected/1138ccfc-7f19-4b1d-aeae-8a37fbb21637-kube-api-access-f4g79\") pod \"keystone-74b9d76575-dkv4w\" (UID: \"1138ccfc-7f19-4b1d-aeae-8a37fbb21637\") " pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.369227 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1138ccfc-7f19-4b1d-aeae-8a37fbb21637-fernet-keys\") pod \"keystone-74b9d76575-dkv4w\" (UID: \"1138ccfc-7f19-4b1d-aeae-8a37fbb21637\") " pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.369270 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1138ccfc-7f19-4b1d-aeae-8a37fbb21637-scripts\") pod \"keystone-74b9d76575-dkv4w\" (UID: \"1138ccfc-7f19-4b1d-aeae-8a37fbb21637\") " pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.374811 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1138ccfc-7f19-4b1d-aeae-8a37fbb21637-config-data\") pod \"keystone-74b9d76575-dkv4w\" (UID: \"1138ccfc-7f19-4b1d-aeae-8a37fbb21637\") " pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.375545 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1138ccfc-7f19-4b1d-aeae-8a37fbb21637-fernet-keys\") pod \"keystone-74b9d76575-dkv4w\" (UID: \"1138ccfc-7f19-4b1d-aeae-8a37fbb21637\") " pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.376260 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1138ccfc-7f19-4b1d-aeae-8a37fbb21637-credential-keys\") pod 
\"keystone-74b9d76575-dkv4w\" (UID: \"1138ccfc-7f19-4b1d-aeae-8a37fbb21637\") " pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.378938 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1138ccfc-7f19-4b1d-aeae-8a37fbb21637-scripts\") pod \"keystone-74b9d76575-dkv4w\" (UID: \"1138ccfc-7f19-4b1d-aeae-8a37fbb21637\") " pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.396525 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4g79\" (UniqueName: \"kubernetes.io/projected/1138ccfc-7f19-4b1d-aeae-8a37fbb21637-kube-api-access-f4g79\") pod \"keystone-74b9d76575-dkv4w\" (UID: \"1138ccfc-7f19-4b1d-aeae-8a37fbb21637\") " pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.518261 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" Nov 26 22:55:03 crc kubenswrapper[5008]: I1126 22:55:03.998323 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-74b9d76575-dkv4w"] Nov 26 22:55:04 crc kubenswrapper[5008]: I1126 22:55:04.040187 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" event={"ID":"1138ccfc-7f19-4b1d-aeae-8a37fbb21637","Type":"ContainerStarted","Data":"5c2043bc56a4e2b71cb6db2c6e4c863a6c8d580ccff1e349b2884bdd40ec3f1e"} Nov 26 22:55:05 crc kubenswrapper[5008]: I1126 22:55:05.049097 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" event={"ID":"1138ccfc-7f19-4b1d-aeae-8a37fbb21637","Type":"ContainerStarted","Data":"30866ccfa7ea51ff4d1cabc80d0d84c0d52bb3e35560723f5ac088e13cf03ef3"} Nov 26 22:55:05 crc kubenswrapper[5008]: I1126 22:55:05.049370 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" Nov 26 22:55:05 crc kubenswrapper[5008]: I1126 22:55:05.076405 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" podStartSLOduration=2.076384478 podStartE2EDuration="2.076384478s" podCreationTimestamp="2025-11-26 22:55:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:55:05.072149966 +0000 UTC m=+980.484843978" watchObservedRunningTime="2025-11-26 22:55:05.076384478 +0000 UTC m=+980.489078480" Nov 26 22:55:14 crc kubenswrapper[5008]: I1126 22:55:14.340434 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl"] Nov 26 22:55:14 crc kubenswrapper[5008]: I1126 22:55:14.342022 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" Nov 26 22:55:14 crc kubenswrapper[5008]: I1126 22:55:14.343518 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-service-cert" Nov 26 22:55:14 crc kubenswrapper[5008]: I1126 22:55:14.345313 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-qf5sb" Nov 26 22:55:14 crc kubenswrapper[5008]: I1126 22:55:14.362826 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl"] Nov 26 22:55:14 crc kubenswrapper[5008]: I1126 22:55:14.459396 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0280585a-1314-4eac-9fc6-d83aa687a4f4-apiservice-cert\") pod \"horizon-operator-controller-manager-5765b658-hkfvl\" (UID: \"0280585a-1314-4eac-9fc6-d83aa687a4f4\") " pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" Nov 26 22:55:14 crc kubenswrapper[5008]: I1126 22:55:14.459443 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmjhz\" (UniqueName: \"kubernetes.io/projected/0280585a-1314-4eac-9fc6-d83aa687a4f4-kube-api-access-cmjhz\") pod \"horizon-operator-controller-manager-5765b658-hkfvl\" (UID: \"0280585a-1314-4eac-9fc6-d83aa687a4f4\") " pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" Nov 26 22:55:14 crc kubenswrapper[5008]: I1126 22:55:14.459498 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0280585a-1314-4eac-9fc6-d83aa687a4f4-webhook-cert\") pod \"horizon-operator-controller-manager-5765b658-hkfvl\" (UID: 
\"0280585a-1314-4eac-9fc6-d83aa687a4f4\") " pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" Nov 26 22:55:14 crc kubenswrapper[5008]: I1126 22:55:14.560730 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0280585a-1314-4eac-9fc6-d83aa687a4f4-apiservice-cert\") pod \"horizon-operator-controller-manager-5765b658-hkfvl\" (UID: \"0280585a-1314-4eac-9fc6-d83aa687a4f4\") " pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" Nov 26 22:55:14 crc kubenswrapper[5008]: I1126 22:55:14.560796 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmjhz\" (UniqueName: \"kubernetes.io/projected/0280585a-1314-4eac-9fc6-d83aa687a4f4-kube-api-access-cmjhz\") pod \"horizon-operator-controller-manager-5765b658-hkfvl\" (UID: \"0280585a-1314-4eac-9fc6-d83aa687a4f4\") " pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" Nov 26 22:55:14 crc kubenswrapper[5008]: I1126 22:55:14.560875 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0280585a-1314-4eac-9fc6-d83aa687a4f4-webhook-cert\") pod \"horizon-operator-controller-manager-5765b658-hkfvl\" (UID: \"0280585a-1314-4eac-9fc6-d83aa687a4f4\") " pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" Nov 26 22:55:14 crc kubenswrapper[5008]: I1126 22:55:14.566222 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0280585a-1314-4eac-9fc6-d83aa687a4f4-apiservice-cert\") pod \"horizon-operator-controller-manager-5765b658-hkfvl\" (UID: \"0280585a-1314-4eac-9fc6-d83aa687a4f4\") " pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" Nov 26 22:55:14 crc kubenswrapper[5008]: I1126 22:55:14.566250 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0280585a-1314-4eac-9fc6-d83aa687a4f4-webhook-cert\") pod \"horizon-operator-controller-manager-5765b658-hkfvl\" (UID: \"0280585a-1314-4eac-9fc6-d83aa687a4f4\") " pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" Nov 26 22:55:14 crc kubenswrapper[5008]: I1126 22:55:14.581629 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmjhz\" (UniqueName: \"kubernetes.io/projected/0280585a-1314-4eac-9fc6-d83aa687a4f4-kube-api-access-cmjhz\") pod \"horizon-operator-controller-manager-5765b658-hkfvl\" (UID: \"0280585a-1314-4eac-9fc6-d83aa687a4f4\") " pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" Nov 26 22:55:14 crc kubenswrapper[5008]: I1126 22:55:14.660738 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" Nov 26 22:55:15 crc kubenswrapper[5008]: I1126 22:55:15.083483 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl"] Nov 26 22:55:15 crc kubenswrapper[5008]: W1126 22:55:15.084929 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0280585a_1314_4eac_9fc6_d83aa687a4f4.slice/crio-b20b0f36e53fd5b8d7a53b8ffdd437b93b9b701876fb31db889c2a6ad6f70dfc WatchSource:0}: Error finding container b20b0f36e53fd5b8d7a53b8ffdd437b93b9b701876fb31db889c2a6ad6f70dfc: Status 404 returned error can't find the container with id b20b0f36e53fd5b8d7a53b8ffdd437b93b9b701876fb31db889c2a6ad6f70dfc Nov 26 22:55:15 crc kubenswrapper[5008]: I1126 22:55:15.126176 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" 
event={"ID":"0280585a-1314-4eac-9fc6-d83aa687a4f4","Type":"ContainerStarted","Data":"b20b0f36e53fd5b8d7a53b8ffdd437b93b9b701876fb31db889c2a6ad6f70dfc"} Nov 26 22:55:18 crc kubenswrapper[5008]: I1126 22:55:18.147921 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" event={"ID":"0280585a-1314-4eac-9fc6-d83aa687a4f4","Type":"ContainerStarted","Data":"8259b0b4910f9a2fe20ae8be5de9bfc9596aa13c2cf2425ec7d1c1c2daa0662c"} Nov 26 22:55:18 crc kubenswrapper[5008]: I1126 22:55:18.148515 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" Nov 26 22:55:18 crc kubenswrapper[5008]: I1126 22:55:18.175045 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" podStartSLOduration=2.090243708 podStartE2EDuration="4.174939249s" podCreationTimestamp="2025-11-26 22:55:14 +0000 UTC" firstStartedPulling="2025-11-26 22:55:15.088477454 +0000 UTC m=+990.501171456" lastFinishedPulling="2025-11-26 22:55:17.173172995 +0000 UTC m=+992.585866997" observedRunningTime="2025-11-26 22:55:18.166783162 +0000 UTC m=+993.579477204" watchObservedRunningTime="2025-11-26 22:55:18.174939249 +0000 UTC m=+993.587633281" Nov 26 22:55:20 crc kubenswrapper[5008]: I1126 22:55:20.702689 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk"] Nov 26 22:55:20 crc kubenswrapper[5008]: I1126 22:55:20.703713 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" Nov 26 22:55:20 crc kubenswrapper[5008]: I1126 22:55:20.706399 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-fz8w6" Nov 26 22:55:20 crc kubenswrapper[5008]: I1126 22:55:20.707530 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-service-cert" Nov 26 22:55:20 crc kubenswrapper[5008]: I1126 22:55:20.766786 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk"] Nov 26 22:55:20 crc kubenswrapper[5008]: I1126 22:55:20.850366 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jh4n\" (UniqueName: \"kubernetes.io/projected/aae33cf9-f71c-4878-86c4-218de3173f3a-kube-api-access-8jh4n\") pod \"swift-operator-controller-manager-86cc6c797c-xj5wk\" (UID: \"aae33cf9-f71c-4878-86c4-218de3173f3a\") " pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" Nov 26 22:55:20 crc kubenswrapper[5008]: I1126 22:55:20.850429 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aae33cf9-f71c-4878-86c4-218de3173f3a-webhook-cert\") pod \"swift-operator-controller-manager-86cc6c797c-xj5wk\" (UID: \"aae33cf9-f71c-4878-86c4-218de3173f3a\") " pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" Nov 26 22:55:20 crc kubenswrapper[5008]: I1126 22:55:20.850656 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aae33cf9-f71c-4878-86c4-218de3173f3a-apiservice-cert\") pod \"swift-operator-controller-manager-86cc6c797c-xj5wk\" (UID: 
\"aae33cf9-f71c-4878-86c4-218de3173f3a\") " pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" Nov 26 22:55:20 crc kubenswrapper[5008]: I1126 22:55:20.952081 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aae33cf9-f71c-4878-86c4-218de3173f3a-apiservice-cert\") pod \"swift-operator-controller-manager-86cc6c797c-xj5wk\" (UID: \"aae33cf9-f71c-4878-86c4-218de3173f3a\") " pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" Nov 26 22:55:20 crc kubenswrapper[5008]: I1126 22:55:20.952182 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jh4n\" (UniqueName: \"kubernetes.io/projected/aae33cf9-f71c-4878-86c4-218de3173f3a-kube-api-access-8jh4n\") pod \"swift-operator-controller-manager-86cc6c797c-xj5wk\" (UID: \"aae33cf9-f71c-4878-86c4-218de3173f3a\") " pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" Nov 26 22:55:20 crc kubenswrapper[5008]: I1126 22:55:20.952238 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aae33cf9-f71c-4878-86c4-218de3173f3a-webhook-cert\") pod \"swift-operator-controller-manager-86cc6c797c-xj5wk\" (UID: \"aae33cf9-f71c-4878-86c4-218de3173f3a\") " pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" Nov 26 22:55:20 crc kubenswrapper[5008]: I1126 22:55:20.962267 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aae33cf9-f71c-4878-86c4-218de3173f3a-webhook-cert\") pod \"swift-operator-controller-manager-86cc6c797c-xj5wk\" (UID: \"aae33cf9-f71c-4878-86c4-218de3173f3a\") " pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" Nov 26 22:55:20 crc kubenswrapper[5008]: I1126 22:55:20.967205 5008 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aae33cf9-f71c-4878-86c4-218de3173f3a-apiservice-cert\") pod \"swift-operator-controller-manager-86cc6c797c-xj5wk\" (UID: \"aae33cf9-f71c-4878-86c4-218de3173f3a\") " pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" Nov 26 22:55:20 crc kubenswrapper[5008]: I1126 22:55:20.982265 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jh4n\" (UniqueName: \"kubernetes.io/projected/aae33cf9-f71c-4878-86c4-218de3173f3a-kube-api-access-8jh4n\") pod \"swift-operator-controller-manager-86cc6c797c-xj5wk\" (UID: \"aae33cf9-f71c-4878-86c4-218de3173f3a\") " pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" Nov 26 22:55:21 crc kubenswrapper[5008]: I1126 22:55:21.032707 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" Nov 26 22:55:21 crc kubenswrapper[5008]: I1126 22:55:21.287249 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk"] Nov 26 22:55:21 crc kubenswrapper[5008]: I1126 22:55:21.297362 5008 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 22:55:22 crc kubenswrapper[5008]: I1126 22:55:22.183366 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" event={"ID":"aae33cf9-f71c-4878-86c4-218de3173f3a","Type":"ContainerStarted","Data":"7e4c4258bc872505adfb6eb2b74303abe076a5b885495c03ec0ac0b5b133053b"} Nov 26 22:55:24 crc kubenswrapper[5008]: I1126 22:55:24.201288 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" 
event={"ID":"aae33cf9-f71c-4878-86c4-218de3173f3a","Type":"ContainerStarted","Data":"87768de0774080581a5c60ce4320bcfb5daa74d521e7e6e17601ed50b0ab799f"} Nov 26 22:55:24 crc kubenswrapper[5008]: I1126 22:55:24.201569 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" Nov 26 22:55:24 crc kubenswrapper[5008]: I1126 22:55:24.219450 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" podStartSLOduration=1.884946022 podStartE2EDuration="4.21943422s" podCreationTimestamp="2025-11-26 22:55:20 +0000 UTC" firstStartedPulling="2025-11-26 22:55:21.296807354 +0000 UTC m=+996.709501356" lastFinishedPulling="2025-11-26 22:55:23.631295552 +0000 UTC m=+999.043989554" observedRunningTime="2025-11-26 22:55:24.216066715 +0000 UTC m=+999.628760717" watchObservedRunningTime="2025-11-26 22:55:24.21943422 +0000 UTC m=+999.632128222" Nov 26 22:55:24 crc kubenswrapper[5008]: I1126 22:55:24.665628 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" Nov 26 22:55:31 crc kubenswrapper[5008]: I1126 22:55:31.040788 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" Nov 26 22:55:34 crc kubenswrapper[5008]: I1126 22:55:34.906497 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/keystone-74b9d76575-dkv4w" Nov 26 22:55:36 crc kubenswrapper[5008]: I1126 22:55:36.923548 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Nov 26 22:55:36 crc kubenswrapper[5008]: I1126 22:55:36.927717 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:36 crc kubenswrapper[5008]: I1126 22:55:36.930738 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-files" Nov 26 22:55:36 crc kubenswrapper[5008]: I1126 22:55:36.930751 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-conf" Nov 26 22:55:36 crc kubenswrapper[5008]: I1126 22:55:36.930883 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-storage-config-data" Nov 26 22:55:36 crc kubenswrapper[5008]: I1126 22:55:36.931709 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-swift-dockercfg-995x9" Nov 26 22:55:36 crc kubenswrapper[5008]: I1126 22:55:36.945076 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.027436 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-etc-swift\") pod \"swift-storage-0\" (UID: \"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.027536 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"swift-storage-0\" (UID: \"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.027569 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkx2x\" (UniqueName: \"kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-kube-api-access-zkx2x\") pod \"swift-storage-0\" (UID: 
\"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.027592 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-cache\") pod \"swift-storage-0\" (UID: \"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.027608 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-lock\") pod \"swift-storage-0\" (UID: \"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.128621 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-etc-swift\") pod \"swift-storage-0\" (UID: \"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:37 crc kubenswrapper[5008]: E1126 22:55:37.128840 5008 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 22:55:37 crc kubenswrapper[5008]: E1126 22:55:37.129016 5008 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Nov 26 22:55:37 crc kubenswrapper[5008]: E1126 22:55:37.129075 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-etc-swift podName:b90153ab-a3cc-4cc8-9d59-5813b2e8d550 nodeName:}" failed. No retries permitted until 2025-11-26 22:55:37.629055705 +0000 UTC m=+1013.041749707 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-etc-swift") pod "swift-storage-0" (UID: "b90153ab-a3cc-4cc8-9d59-5813b2e8d550") : configmap "swift-ring-files" not found Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.129355 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"swift-storage-0\" (UID: \"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.129641 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkx2x\" (UniqueName: \"kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-kube-api-access-zkx2x\") pod \"swift-storage-0\" (UID: \"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.129873 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-cache\") pod \"swift-storage-0\" (UID: \"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.129699 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"swift-storage-0\" (UID: \"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.130068 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-lock\") pod \"swift-storage-0\" (UID: 
\"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.130368 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-cache\") pod \"swift-storage-0\" (UID: \"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.130952 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-lock\") pod \"swift-storage-0\" (UID: \"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.157120 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkx2x\" (UniqueName: \"kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-kube-api-access-zkx2x\") pod \"swift-storage-0\" (UID: \"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.167339 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"swift-storage-0\" (UID: \"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.471007 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-lk2l2"] Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.472861 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.475858 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-proxy-config-data" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.476290 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-config-data" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.477475 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-scripts" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.483607 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-lk2l2"] Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.535145 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c006b487-ea33-4b37-b3fa-2105efbf7717-dispersionconf\") pod \"swift-ring-rebalance-lk2l2\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.535204 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c006b487-ea33-4b37-b3fa-2105efbf7717-scripts\") pod \"swift-ring-rebalance-lk2l2\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.535245 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c006b487-ea33-4b37-b3fa-2105efbf7717-etc-swift\") pod \"swift-ring-rebalance-lk2l2\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 
22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.535270 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c006b487-ea33-4b37-b3fa-2105efbf7717-ring-data-devices\") pod \"swift-ring-rebalance-lk2l2\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.535292 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l45fj\" (UniqueName: \"kubernetes.io/projected/c006b487-ea33-4b37-b3fa-2105efbf7717-kube-api-access-l45fj\") pod \"swift-ring-rebalance-lk2l2\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.535383 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c006b487-ea33-4b37-b3fa-2105efbf7717-swiftconf\") pod \"swift-ring-rebalance-lk2l2\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.636342 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-etc-swift\") pod \"swift-storage-0\" (UID: \"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.636415 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c006b487-ea33-4b37-b3fa-2105efbf7717-swiftconf\") pod \"swift-ring-rebalance-lk2l2\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:37 
crc kubenswrapper[5008]: I1126 22:55:37.636482 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c006b487-ea33-4b37-b3fa-2105efbf7717-dispersionconf\") pod \"swift-ring-rebalance-lk2l2\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.636514 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c006b487-ea33-4b37-b3fa-2105efbf7717-scripts\") pod \"swift-ring-rebalance-lk2l2\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.636554 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c006b487-ea33-4b37-b3fa-2105efbf7717-etc-swift\") pod \"swift-ring-rebalance-lk2l2\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.636580 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c006b487-ea33-4b37-b3fa-2105efbf7717-ring-data-devices\") pod \"swift-ring-rebalance-lk2l2\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.636605 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l45fj\" (UniqueName: \"kubernetes.io/projected/c006b487-ea33-4b37-b3fa-2105efbf7717-kube-api-access-l45fj\") pod \"swift-ring-rebalance-lk2l2\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:37 crc kubenswrapper[5008]: E1126 
22:55:37.636553 5008 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 22:55:37 crc kubenswrapper[5008]: E1126 22:55:37.637080 5008 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Nov 26 22:55:37 crc kubenswrapper[5008]: E1126 22:55:37.637129 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-etc-swift podName:b90153ab-a3cc-4cc8-9d59-5813b2e8d550 nodeName:}" failed. No retries permitted until 2025-11-26 22:55:38.637111252 +0000 UTC m=+1014.049805254 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-etc-swift") pod "swift-storage-0" (UID: "b90153ab-a3cc-4cc8-9d59-5813b2e8d550") : configmap "swift-ring-files" not found Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.637539 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c006b487-ea33-4b37-b3fa-2105efbf7717-scripts\") pod \"swift-ring-rebalance-lk2l2\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.638026 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c006b487-ea33-4b37-b3fa-2105efbf7717-etc-swift\") pod \"swift-ring-rebalance-lk2l2\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.638187 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c006b487-ea33-4b37-b3fa-2105efbf7717-ring-data-devices\") pod 
\"swift-ring-rebalance-lk2l2\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.642278 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c006b487-ea33-4b37-b3fa-2105efbf7717-dispersionconf\") pod \"swift-ring-rebalance-lk2l2\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.642455 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c006b487-ea33-4b37-b3fa-2105efbf7717-swiftconf\") pod \"swift-ring-rebalance-lk2l2\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.658391 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l45fj\" (UniqueName: \"kubernetes.io/projected/c006b487-ea33-4b37-b3fa-2105efbf7717-kube-api-access-l45fj\") pod \"swift-ring-rebalance-lk2l2\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:37 crc kubenswrapper[5008]: I1126 22:55:37.793990 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.241099 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-lk2l2"] Nov 26 22:55:38 crc kubenswrapper[5008]: W1126 22:55:38.252295 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc006b487_ea33_4b37_b3fa_2105efbf7717.slice/crio-1093f923e65acdc6e0e8b16fdff5dd32ffe6c471658187c486e8b9cd1f7504b3 WatchSource:0}: Error finding container 1093f923e65acdc6e0e8b16fdff5dd32ffe6c471658187c486e8b9cd1f7504b3: Status 404 returned error can't find the container with id 1093f923e65acdc6e0e8b16fdff5dd32ffe6c471658187c486e8b9cd1f7504b3 Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.307501 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" event={"ID":"c006b487-ea33-4b37-b3fa-2105efbf7717","Type":"ContainerStarted","Data":"1093f923e65acdc6e0e8b16fdff5dd32ffe6c471658187c486e8b9cd1f7504b3"} Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.339578 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-index-6dg96"] Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.340515 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-6dg96" Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.345201 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-index-dockercfg-8fx9r" Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.347276 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-6dg96"] Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.445459 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lczw7\" (UniqueName: \"kubernetes.io/projected/4b2dec9b-659a-40df-948c-3fd70d7e4d24-kube-api-access-lczw7\") pod \"glance-operator-index-6dg96\" (UID: \"4b2dec9b-659a-40df-948c-3fd70d7e4d24\") " pod="openstack-operators/glance-operator-index-6dg96" Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.546601 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lczw7\" (UniqueName: \"kubernetes.io/projected/4b2dec9b-659a-40df-948c-3fd70d7e4d24-kube-api-access-lczw7\") pod \"glance-operator-index-6dg96\" (UID: \"4b2dec9b-659a-40df-948c-3fd70d7e4d24\") " pod="openstack-operators/glance-operator-index-6dg96" Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.566802 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lczw7\" (UniqueName: \"kubernetes.io/projected/4b2dec9b-659a-40df-948c-3fd70d7e4d24-kube-api-access-lczw7\") pod \"glance-operator-index-6dg96\" (UID: \"4b2dec9b-659a-40df-948c-3fd70d7e4d24\") " pod="openstack-operators/glance-operator-index-6dg96" Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.642247 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r"] Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.643448 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.647652 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-etc-swift\") pod \"swift-storage-0\" (UID: \"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:38 crc kubenswrapper[5008]: E1126 22:55:38.647822 5008 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 22:55:38 crc kubenswrapper[5008]: E1126 22:55:38.647843 5008 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Nov 26 22:55:38 crc kubenswrapper[5008]: E1126 22:55:38.647895 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-etc-swift podName:b90153ab-a3cc-4cc8-9d59-5813b2e8d550 nodeName:}" failed. No retries permitted until 2025-11-26 22:55:40.647879617 +0000 UTC m=+1016.060573629 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-etc-swift") pod "swift-storage-0" (UID: "b90153ab-a3cc-4cc8-9d59-5813b2e8d550") : configmap "swift-ring-files" not found Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.654827 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r"] Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.666791 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-6dg96" Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.749898 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-etc-swift\") pod \"swift-proxy-6bd58cfcf7-gvt6r\" (UID: \"1931c8a4-5c2e-4c34-9615-29e169ec5f45\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.750213 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2z5j\" (UniqueName: \"kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-kube-api-access-c2z5j\") pod \"swift-proxy-6bd58cfcf7-gvt6r\" (UID: \"1931c8a4-5c2e-4c34-9615-29e169ec5f45\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.750244 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1931c8a4-5c2e-4c34-9615-29e169ec5f45-config-data\") pod \"swift-proxy-6bd58cfcf7-gvt6r\" (UID: \"1931c8a4-5c2e-4c34-9615-29e169ec5f45\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.750281 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1931c8a4-5c2e-4c34-9615-29e169ec5f45-log-httpd\") pod \"swift-proxy-6bd58cfcf7-gvt6r\" (UID: \"1931c8a4-5c2e-4c34-9615-29e169ec5f45\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.750355 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1931c8a4-5c2e-4c34-9615-29e169ec5f45-run-httpd\") 
pod \"swift-proxy-6bd58cfcf7-gvt6r\" (UID: \"1931c8a4-5c2e-4c34-9615-29e169ec5f45\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.851863 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1931c8a4-5c2e-4c34-9615-29e169ec5f45-config-data\") pod \"swift-proxy-6bd58cfcf7-gvt6r\" (UID: \"1931c8a4-5c2e-4c34-9615-29e169ec5f45\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.851932 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1931c8a4-5c2e-4c34-9615-29e169ec5f45-log-httpd\") pod \"swift-proxy-6bd58cfcf7-gvt6r\" (UID: \"1931c8a4-5c2e-4c34-9615-29e169ec5f45\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.852034 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1931c8a4-5c2e-4c34-9615-29e169ec5f45-run-httpd\") pod \"swift-proxy-6bd58cfcf7-gvt6r\" (UID: \"1931c8a4-5c2e-4c34-9615-29e169ec5f45\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.852093 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-etc-swift\") pod \"swift-proxy-6bd58cfcf7-gvt6r\" (UID: \"1931c8a4-5c2e-4c34-9615-29e169ec5f45\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.852124 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2z5j\" (UniqueName: \"kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-kube-api-access-c2z5j\") pod 
\"swift-proxy-6bd58cfcf7-gvt6r\" (UID: \"1931c8a4-5c2e-4c34-9615-29e169ec5f45\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.852945 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1931c8a4-5c2e-4c34-9615-29e169ec5f45-log-httpd\") pod \"swift-proxy-6bd58cfcf7-gvt6r\" (UID: \"1931c8a4-5c2e-4c34-9615-29e169ec5f45\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.853010 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1931c8a4-5c2e-4c34-9615-29e169ec5f45-run-httpd\") pod \"swift-proxy-6bd58cfcf7-gvt6r\" (UID: \"1931c8a4-5c2e-4c34-9615-29e169ec5f45\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:38 crc kubenswrapper[5008]: E1126 22:55:38.853050 5008 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 22:55:38 crc kubenswrapper[5008]: E1126 22:55:38.853066 5008 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r: configmap "swift-ring-files" not found Nov 26 22:55:38 crc kubenswrapper[5008]: E1126 22:55:38.853104 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-etc-swift podName:1931c8a4-5c2e-4c34-9615-29e169ec5f45 nodeName:}" failed. No retries permitted until 2025-11-26 22:55:39.353089593 +0000 UTC m=+1014.765783605 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-etc-swift") pod "swift-proxy-6bd58cfcf7-gvt6r" (UID: "1931c8a4-5c2e-4c34-9615-29e169ec5f45") : configmap "swift-ring-files" not found Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.857191 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1931c8a4-5c2e-4c34-9615-29e169ec5f45-config-data\") pod \"swift-proxy-6bd58cfcf7-gvt6r\" (UID: \"1931c8a4-5c2e-4c34-9615-29e169ec5f45\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:38 crc kubenswrapper[5008]: I1126 22:55:38.883206 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2z5j\" (UniqueName: \"kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-kube-api-access-c2z5j\") pod \"swift-proxy-6bd58cfcf7-gvt6r\" (UID: \"1931c8a4-5c2e-4c34-9615-29e169ec5f45\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:39 crc kubenswrapper[5008]: I1126 22:55:39.076405 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-6dg96"] Nov 26 22:55:39 crc kubenswrapper[5008]: I1126 22:55:39.318319 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-6dg96" event={"ID":"4b2dec9b-659a-40df-948c-3fd70d7e4d24","Type":"ContainerStarted","Data":"0478f5844315556789f62d3534df7aae4712486071a74a100d1b48010c49bde4"} Nov 26 22:55:39 crc kubenswrapper[5008]: I1126 22:55:39.360885 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-etc-swift\") pod \"swift-proxy-6bd58cfcf7-gvt6r\" (UID: \"1931c8a4-5c2e-4c34-9615-29e169ec5f45\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:39 crc kubenswrapper[5008]: E1126 22:55:39.361296 
5008 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 22:55:39 crc kubenswrapper[5008]: E1126 22:55:39.361342 5008 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r: configmap "swift-ring-files" not found Nov 26 22:55:39 crc kubenswrapper[5008]: E1126 22:55:39.361590 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-etc-swift podName:1931c8a4-5c2e-4c34-9615-29e169ec5f45 nodeName:}" failed. No retries permitted until 2025-11-26 22:55:40.361574154 +0000 UTC m=+1015.774268156 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-etc-swift") pod "swift-proxy-6bd58cfcf7-gvt6r" (UID: "1931c8a4-5c2e-4c34-9615-29e169ec5f45") : configmap "swift-ring-files" not found Nov 26 22:55:40 crc kubenswrapper[5008]: I1126 22:55:40.381742 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-etc-swift\") pod \"swift-proxy-6bd58cfcf7-gvt6r\" (UID: \"1931c8a4-5c2e-4c34-9615-29e169ec5f45\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:40 crc kubenswrapper[5008]: E1126 22:55:40.381949 5008 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 22:55:40 crc kubenswrapper[5008]: E1126 22:55:40.381992 5008 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r: configmap "swift-ring-files" not found Nov 26 22:55:40 crc kubenswrapper[5008]: E1126 22:55:40.382051 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-etc-swift 
podName:1931c8a4-5c2e-4c34-9615-29e169ec5f45 nodeName:}" failed. No retries permitted until 2025-11-26 22:55:42.382033323 +0000 UTC m=+1017.794727325 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-etc-swift") pod "swift-proxy-6bd58cfcf7-gvt6r" (UID: "1931c8a4-5c2e-4c34-9615-29e169ec5f45") : configmap "swift-ring-files" not found Nov 26 22:55:40 crc kubenswrapper[5008]: I1126 22:55:40.686346 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-etc-swift\") pod \"swift-storage-0\" (UID: \"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:40 crc kubenswrapper[5008]: E1126 22:55:40.686484 5008 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 22:55:40 crc kubenswrapper[5008]: E1126 22:55:40.686496 5008 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Nov 26 22:55:40 crc kubenswrapper[5008]: E1126 22:55:40.686542 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-etc-swift podName:b90153ab-a3cc-4cc8-9d59-5813b2e8d550 nodeName:}" failed. No retries permitted until 2025-11-26 22:55:44.686529204 +0000 UTC m=+1020.099223206 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-etc-swift") pod "swift-storage-0" (UID: "b90153ab-a3cc-4cc8-9d59-5813b2e8d550") : configmap "swift-ring-files" not found Nov 26 22:55:42 crc kubenswrapper[5008]: I1126 22:55:42.416009 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-etc-swift\") pod \"swift-proxy-6bd58cfcf7-gvt6r\" (UID: \"1931c8a4-5c2e-4c34-9615-29e169ec5f45\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:42 crc kubenswrapper[5008]: E1126 22:55:42.416249 5008 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 22:55:42 crc kubenswrapper[5008]: E1126 22:55:42.416600 5008 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r: configmap "swift-ring-files" not found Nov 26 22:55:42 crc kubenswrapper[5008]: E1126 22:55:42.416673 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-etc-swift podName:1931c8a4-5c2e-4c34-9615-29e169ec5f45 nodeName:}" failed. No retries permitted until 2025-11-26 22:55:46.416650674 +0000 UTC m=+1021.829344686 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-etc-swift") pod "swift-proxy-6bd58cfcf7-gvt6r" (UID: "1931c8a4-5c2e-4c34-9615-29e169ec5f45") : configmap "swift-ring-files" not found Nov 26 22:55:44 crc kubenswrapper[5008]: I1126 22:55:44.747749 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-etc-swift\") pod \"swift-storage-0\" (UID: \"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:44 crc kubenswrapper[5008]: E1126 22:55:44.747956 5008 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 22:55:44 crc kubenswrapper[5008]: E1126 22:55:44.748000 5008 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Nov 26 22:55:44 crc kubenswrapper[5008]: E1126 22:55:44.748065 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-etc-swift podName:b90153ab-a3cc-4cc8-9d59-5813b2e8d550 nodeName:}" failed. No retries permitted until 2025-11-26 22:55:52.748044305 +0000 UTC m=+1028.160738307 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-etc-swift") pod "swift-storage-0" (UID: "b90153ab-a3cc-4cc8-9d59-5813b2e8d550") : configmap "swift-ring-files" not found Nov 26 22:55:46 crc kubenswrapper[5008]: I1126 22:55:46.499034 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-etc-swift\") pod \"swift-proxy-6bd58cfcf7-gvt6r\" (UID: \"1931c8a4-5c2e-4c34-9615-29e169ec5f45\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:46 crc kubenswrapper[5008]: E1126 22:55:46.499213 5008 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 22:55:46 crc kubenswrapper[5008]: E1126 22:55:46.499396 5008 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r: configmap "swift-ring-files" not found Nov 26 22:55:46 crc kubenswrapper[5008]: E1126 22:55:46.499454 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-etc-swift podName:1931c8a4-5c2e-4c34-9615-29e169ec5f45 nodeName:}" failed. No retries permitted until 2025-11-26 22:55:54.499437181 +0000 UTC m=+1029.912131193 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-etc-swift") pod "swift-proxy-6bd58cfcf7-gvt6r" (UID: "1931c8a4-5c2e-4c34-9615-29e169ec5f45") : configmap "swift-ring-files" not found Nov 26 22:55:47 crc kubenswrapper[5008]: I1126 22:55:47.396026 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" event={"ID":"c006b487-ea33-4b37-b3fa-2105efbf7717","Type":"ContainerStarted","Data":"7c705af9e7de1c37004073983654795f553d97be9bf404dbfbb5d8eea879f3e5"} Nov 26 22:55:47 crc kubenswrapper[5008]: I1126 22:55:47.399195 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-6dg96" event={"ID":"4b2dec9b-659a-40df-948c-3fd70d7e4d24","Type":"ContainerStarted","Data":"a6246052338d8815b96044044f2f048b43e46fa2d8c46015258ef646c1a9b151"} Nov 26 22:55:47 crc kubenswrapper[5008]: I1126 22:55:47.424567 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" podStartSLOduration=2.368850231 podStartE2EDuration="10.42454993s" podCreationTimestamp="2025-11-26 22:55:37 +0000 UTC" firstStartedPulling="2025-11-26 22:55:38.254877809 +0000 UTC m=+1013.667571811" lastFinishedPulling="2025-11-26 22:55:46.310577468 +0000 UTC m=+1021.723271510" observedRunningTime="2025-11-26 22:55:47.423063263 +0000 UTC m=+1022.835757265" watchObservedRunningTime="2025-11-26 22:55:47.42454993 +0000 UTC m=+1022.837243922" Nov 26 22:55:48 crc kubenswrapper[5008]: I1126 22:55:48.667197 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-index-6dg96" Nov 26 22:55:48 crc kubenswrapper[5008]: I1126 22:55:48.667379 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/glance-operator-index-6dg96" Nov 26 22:55:48 crc kubenswrapper[5008]: I1126 22:55:48.720572 5008 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/glance-operator-index-6dg96" Nov 26 22:55:48 crc kubenswrapper[5008]: I1126 22:55:48.751200 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-index-6dg96" podStartSLOduration=3.55065564 podStartE2EDuration="10.751169983s" podCreationTimestamp="2025-11-26 22:55:38 +0000 UTC" firstStartedPulling="2025-11-26 22:55:39.084791472 +0000 UTC m=+1014.497485504" lastFinishedPulling="2025-11-26 22:55:46.285305805 +0000 UTC m=+1021.697999847" observedRunningTime="2025-11-26 22:55:47.450212414 +0000 UTC m=+1022.862906446" watchObservedRunningTime="2025-11-26 22:55:48.751169983 +0000 UTC m=+1024.163864025" Nov 26 22:55:52 crc kubenswrapper[5008]: I1126 22:55:52.795170 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-etc-swift\") pod \"swift-storage-0\" (UID: \"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:52 crc kubenswrapper[5008]: I1126 22:55:52.808573 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b90153ab-a3cc-4cc8-9d59-5813b2e8d550-etc-swift\") pod \"swift-storage-0\" (UID: \"b90153ab-a3cc-4cc8-9d59-5813b2e8d550\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:52 crc kubenswrapper[5008]: I1126 22:55:52.843701 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Nov 26 22:55:53 crc kubenswrapper[5008]: I1126 22:55:53.311567 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Nov 26 22:55:53 crc kubenswrapper[5008]: I1126 22:55:53.441501 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"b90153ab-a3cc-4cc8-9d59-5813b2e8d550","Type":"ContainerStarted","Data":"434415f428d9adc0c71e39d91aeb43baea251d95165840245d6b5f3086842407"} Nov 26 22:55:53 crc kubenswrapper[5008]: I1126 22:55:53.443034 5008 generic.go:334] "Generic (PLEG): container finished" podID="c006b487-ea33-4b37-b3fa-2105efbf7717" containerID="7c705af9e7de1c37004073983654795f553d97be9bf404dbfbb5d8eea879f3e5" exitCode=0 Nov 26 22:55:53 crc kubenswrapper[5008]: I1126 22:55:53.443070 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" event={"ID":"c006b487-ea33-4b37-b3fa-2105efbf7717","Type":"ContainerDied","Data":"7c705af9e7de1c37004073983654795f553d97be9bf404dbfbb5d8eea879f3e5"} Nov 26 22:55:54 crc kubenswrapper[5008]: I1126 22:55:54.524874 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-etc-swift\") pod \"swift-proxy-6bd58cfcf7-gvt6r\" (UID: \"1931c8a4-5c2e-4c34-9615-29e169ec5f45\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:54 crc kubenswrapper[5008]: I1126 22:55:54.547219 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1931c8a4-5c2e-4c34-9615-29e169ec5f45-etc-swift\") pod \"swift-proxy-6bd58cfcf7-gvt6r\" (UID: \"1931c8a4-5c2e-4c34-9615-29e169ec5f45\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:54 crc kubenswrapper[5008]: I1126 22:55:54.566950 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:54 crc kubenswrapper[5008]: I1126 22:55:54.906039 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:54 crc kubenswrapper[5008]: I1126 22:55:54.930367 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c006b487-ea33-4b37-b3fa-2105efbf7717-swiftconf\") pod \"c006b487-ea33-4b37-b3fa-2105efbf7717\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " Nov 26 22:55:54 crc kubenswrapper[5008]: I1126 22:55:54.930408 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c006b487-ea33-4b37-b3fa-2105efbf7717-scripts\") pod \"c006b487-ea33-4b37-b3fa-2105efbf7717\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " Nov 26 22:55:54 crc kubenswrapper[5008]: I1126 22:55:54.930459 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c006b487-ea33-4b37-b3fa-2105efbf7717-ring-data-devices\") pod \"c006b487-ea33-4b37-b3fa-2105efbf7717\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " Nov 26 22:55:54 crc kubenswrapper[5008]: I1126 22:55:54.930485 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c006b487-ea33-4b37-b3fa-2105efbf7717-etc-swift\") pod \"c006b487-ea33-4b37-b3fa-2105efbf7717\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " Nov 26 22:55:54 crc kubenswrapper[5008]: I1126 22:55:54.930509 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l45fj\" (UniqueName: \"kubernetes.io/projected/c006b487-ea33-4b37-b3fa-2105efbf7717-kube-api-access-l45fj\") pod \"c006b487-ea33-4b37-b3fa-2105efbf7717\" (UID: 
\"c006b487-ea33-4b37-b3fa-2105efbf7717\") " Nov 26 22:55:54 crc kubenswrapper[5008]: I1126 22:55:54.930531 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c006b487-ea33-4b37-b3fa-2105efbf7717-dispersionconf\") pod \"c006b487-ea33-4b37-b3fa-2105efbf7717\" (UID: \"c006b487-ea33-4b37-b3fa-2105efbf7717\") " Nov 26 22:55:54 crc kubenswrapper[5008]: I1126 22:55:54.933285 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c006b487-ea33-4b37-b3fa-2105efbf7717-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c006b487-ea33-4b37-b3fa-2105efbf7717" (UID: "c006b487-ea33-4b37-b3fa-2105efbf7717"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:55:54 crc kubenswrapper[5008]: I1126 22:55:54.934833 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c006b487-ea33-4b37-b3fa-2105efbf7717-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c006b487-ea33-4b37-b3fa-2105efbf7717" (UID: "c006b487-ea33-4b37-b3fa-2105efbf7717"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:55:54 crc kubenswrapper[5008]: I1126 22:55:54.941790 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c006b487-ea33-4b37-b3fa-2105efbf7717-kube-api-access-l45fj" (OuterVolumeSpecName: "kube-api-access-l45fj") pod "c006b487-ea33-4b37-b3fa-2105efbf7717" (UID: "c006b487-ea33-4b37-b3fa-2105efbf7717"). InnerVolumeSpecName "kube-api-access-l45fj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:55:54 crc kubenswrapper[5008]: I1126 22:55:54.947149 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c006b487-ea33-4b37-b3fa-2105efbf7717-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c006b487-ea33-4b37-b3fa-2105efbf7717" (UID: "c006b487-ea33-4b37-b3fa-2105efbf7717"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:55:54 crc kubenswrapper[5008]: I1126 22:55:54.967203 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c006b487-ea33-4b37-b3fa-2105efbf7717-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c006b487-ea33-4b37-b3fa-2105efbf7717" (UID: "c006b487-ea33-4b37-b3fa-2105efbf7717"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:55:54 crc kubenswrapper[5008]: I1126 22:55:54.970549 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c006b487-ea33-4b37-b3fa-2105efbf7717-scripts" (OuterVolumeSpecName: "scripts") pod "c006b487-ea33-4b37-b3fa-2105efbf7717" (UID: "c006b487-ea33-4b37-b3fa-2105efbf7717"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:55:55 crc kubenswrapper[5008]: I1126 22:55:55.033015 5008 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c006b487-ea33-4b37-b3fa-2105efbf7717-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 26 22:55:55 crc kubenswrapper[5008]: I1126 22:55:55.033411 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c006b487-ea33-4b37-b3fa-2105efbf7717-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:55:55 crc kubenswrapper[5008]: I1126 22:55:55.033434 5008 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c006b487-ea33-4b37-b3fa-2105efbf7717-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 26 22:55:55 crc kubenswrapper[5008]: I1126 22:55:55.033455 5008 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c006b487-ea33-4b37-b3fa-2105efbf7717-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 26 22:55:55 crc kubenswrapper[5008]: I1126 22:55:55.033473 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l45fj\" (UniqueName: \"kubernetes.io/projected/c006b487-ea33-4b37-b3fa-2105efbf7717-kube-api-access-l45fj\") on node \"crc\" DevicePath \"\"" Nov 26 22:55:55 crc kubenswrapper[5008]: I1126 22:55:55.033491 5008 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c006b487-ea33-4b37-b3fa-2105efbf7717-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 26 22:55:55 crc kubenswrapper[5008]: I1126 22:55:55.249177 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r"] Nov 26 22:55:55 crc kubenswrapper[5008]: I1126 22:55:55.481335 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" 
event={"ID":"1931c8a4-5c2e-4c34-9615-29e169ec5f45","Type":"ContainerStarted","Data":"1ae0ca9b5e3ee7779d0e6304951faf4eeadd911c809764e7e0293ae528131c06"} Nov 26 22:55:55 crc kubenswrapper[5008]: I1126 22:55:55.481377 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" event={"ID":"1931c8a4-5c2e-4c34-9615-29e169ec5f45","Type":"ContainerStarted","Data":"4381a00e66a7fc7e0d7208c47f184986bf0fec79c73c2e7ccc5b3bdda4e6b8bd"} Nov 26 22:55:55 crc kubenswrapper[5008]: I1126 22:55:55.504335 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"b90153ab-a3cc-4cc8-9d59-5813b2e8d550","Type":"ContainerStarted","Data":"832a2eb207684c2b416478ea24943d16ff4aa1503a32cfe179dc4edcb2d0a664"} Nov 26 22:55:55 crc kubenswrapper[5008]: I1126 22:55:55.504381 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"b90153ab-a3cc-4cc8-9d59-5813b2e8d550","Type":"ContainerStarted","Data":"ed8de466f892ea0cfee055033279de5140d29dfd0a6702da9fc65c5b45101e11"} Nov 26 22:55:55 crc kubenswrapper[5008]: I1126 22:55:55.504393 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"b90153ab-a3cc-4cc8-9d59-5813b2e8d550","Type":"ContainerStarted","Data":"92b7bcbd85eb5e6d3c29ea121a2dc8cebedd733424aa44a52f5a9944d8de2d7a"} Nov 26 22:55:55 crc kubenswrapper[5008]: I1126 22:55:55.504401 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"b90153ab-a3cc-4cc8-9d59-5813b2e8d550","Type":"ContainerStarted","Data":"c8baeea910ca710db2b557d7fa43c62798226d34b3df7c24b67d0214bf64dd20"} Nov 26 22:55:55 crc kubenswrapper[5008]: I1126 22:55:55.521224 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" Nov 26 22:55:55 crc kubenswrapper[5008]: I1126 22:55:55.537969 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-lk2l2" event={"ID":"c006b487-ea33-4b37-b3fa-2105efbf7717","Type":"ContainerDied","Data":"1093f923e65acdc6e0e8b16fdff5dd32ffe6c471658187c486e8b9cd1f7504b3"} Nov 26 22:55:55 crc kubenswrapper[5008]: I1126 22:55:55.538034 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1093f923e65acdc6e0e8b16fdff5dd32ffe6c471658187c486e8b9cd1f7504b3" Nov 26 22:55:56 crc kubenswrapper[5008]: I1126 22:55:56.552738 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" event={"ID":"1931c8a4-5c2e-4c34-9615-29e169ec5f45","Type":"ContainerStarted","Data":"5464dc1c39576805190ddcfb4619b42e01ef2be6d7df971eb07757214494fcc7"} Nov 26 22:55:56 crc kubenswrapper[5008]: I1126 22:55:56.553425 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:56 crc kubenswrapper[5008]: I1126 22:55:56.553456 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:55:56 crc kubenswrapper[5008]: I1126 22:55:56.589020 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" podStartSLOduration=18.588997107 podStartE2EDuration="18.588997107s" podCreationTimestamp="2025-11-26 22:55:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:55:56.581846733 +0000 UTC m=+1031.994540785" watchObservedRunningTime="2025-11-26 22:55:56.588997107 +0000 UTC m=+1032.001691129" Nov 26 22:55:57 crc kubenswrapper[5008]: I1126 22:55:57.576852 5008 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"b90153ab-a3cc-4cc8-9d59-5813b2e8d550","Type":"ContainerStarted","Data":"d7fa5d639c4e49e430aa9a17fd3ea007aa60f8689ae93604651a04d152c7c736"} Nov 26 22:55:57 crc kubenswrapper[5008]: I1126 22:55:57.577174 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"b90153ab-a3cc-4cc8-9d59-5813b2e8d550","Type":"ContainerStarted","Data":"339f5543d7184e2f70f2b16d6ba8474db4b16e849983340fbba0c6cbfd577df4"} Nov 26 22:55:57 crc kubenswrapper[5008]: I1126 22:55:57.577186 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"b90153ab-a3cc-4cc8-9d59-5813b2e8d550","Type":"ContainerStarted","Data":"a08a0f1998793663b706e7b7732bdbbe56c5b5fc079942e2b16d3f1736fd2e20"} Nov 26 22:55:58 crc kubenswrapper[5008]: I1126 22:55:58.593729 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"b90153ab-a3cc-4cc8-9d59-5813b2e8d550","Type":"ContainerStarted","Data":"612fcf4490c51f4a4d26687d340e3f1930b3d66c7a05b14d00b777ee36e159b4"} Nov 26 22:55:58 crc kubenswrapper[5008]: I1126 22:55:58.711067 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-index-6dg96" Nov 26 22:55:59 crc kubenswrapper[5008]: I1126 22:55:59.634543 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"b90153ab-a3cc-4cc8-9d59-5813b2e8d550","Type":"ContainerStarted","Data":"9e8fd8bf166acc475471cab05e0859b821516000afc581e292e1d4541bc85db7"} Nov 26 22:55:59 crc kubenswrapper[5008]: I1126 22:55:59.635026 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"b90153ab-a3cc-4cc8-9d59-5813b2e8d550","Type":"ContainerStarted","Data":"6f9b1a670af389358659ec38361d1efd6dc4c99482a69c24d8beaed0d7d71163"} Nov 26 22:56:00 crc kubenswrapper[5008]: I1126 
22:56:00.644531 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"b90153ab-a3cc-4cc8-9d59-5813b2e8d550","Type":"ContainerStarted","Data":"0ad68101e351839f6f9e8fa8fb6eb3099740b53ced9574cba1577a5dead4dd09"} Nov 26 22:56:00 crc kubenswrapper[5008]: I1126 22:56:00.644573 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"b90153ab-a3cc-4cc8-9d59-5813b2e8d550","Type":"ContainerStarted","Data":"44e99f540c9aed84896855ade467ece3888324223ab64f123f73b29973504b9d"} Nov 26 22:56:00 crc kubenswrapper[5008]: I1126 22:56:00.644582 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"b90153ab-a3cc-4cc8-9d59-5813b2e8d550","Type":"ContainerStarted","Data":"e14cfa324567a8e7984d10dc951e28863081de33a445c25f20772dfeac109706"} Nov 26 22:56:00 crc kubenswrapper[5008]: I1126 22:56:00.644592 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"b90153ab-a3cc-4cc8-9d59-5813b2e8d550","Type":"ContainerStarted","Data":"b0d4ce8c5298ce6039fdf920c22221cde53c017b7f7696a93536daad9090a1d1"} Nov 26 22:56:00 crc kubenswrapper[5008]: I1126 22:56:00.644603 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"b90153ab-a3cc-4cc8-9d59-5813b2e8d550","Type":"ContainerStarted","Data":"280e293dd97285f724d50cebf544213ebd9af975c5d9f3ee864d41a1510cc23a"} Nov 26 22:56:00 crc kubenswrapper[5008]: I1126 22:56:00.682127 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-storage-0" podStartSLOduration=19.858799583 podStartE2EDuration="25.682110526s" podCreationTimestamp="2025-11-26 22:55:35 +0000 UTC" firstStartedPulling="2025-11-26 22:55:53.331967271 +0000 UTC m=+1028.744661263" lastFinishedPulling="2025-11-26 22:55:59.155278204 +0000 UTC m=+1034.567972206" observedRunningTime="2025-11-26 
22:56:00.676324774 +0000 UTC m=+1036.089018786" watchObservedRunningTime="2025-11-26 22:56:00.682110526 +0000 UTC m=+1036.094804528" Nov 26 22:56:04 crc kubenswrapper[5008]: I1126 22:56:04.573521 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:56:04 crc kubenswrapper[5008]: I1126 22:56:04.577784 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-gvt6r" Nov 26 22:56:07 crc kubenswrapper[5008]: I1126 22:56:07.793164 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb"] Nov 26 22:56:07 crc kubenswrapper[5008]: E1126 22:56:07.794169 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c006b487-ea33-4b37-b3fa-2105efbf7717" containerName="swift-ring-rebalance" Nov 26 22:56:07 crc kubenswrapper[5008]: I1126 22:56:07.794194 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c006b487-ea33-4b37-b3fa-2105efbf7717" containerName="swift-ring-rebalance" Nov 26 22:56:07 crc kubenswrapper[5008]: I1126 22:56:07.794419 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="c006b487-ea33-4b37-b3fa-2105efbf7717" containerName="swift-ring-rebalance" Nov 26 22:56:07 crc kubenswrapper[5008]: I1126 22:56:07.796004 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb" Nov 26 22:56:07 crc kubenswrapper[5008]: I1126 22:56:07.798274 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6kdt6" Nov 26 22:56:07 crc kubenswrapper[5008]: I1126 22:56:07.810809 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb"] Nov 26 22:56:07 crc kubenswrapper[5008]: I1126 22:56:07.941574 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51a6479b-5e2c-43c8-8d18-f4db3cccb14e-util\") pod \"5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb\" (UID: \"51a6479b-5e2c-43c8-8d18-f4db3cccb14e\") " pod="openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb" Nov 26 22:56:07 crc kubenswrapper[5008]: I1126 22:56:07.941633 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51a6479b-5e2c-43c8-8d18-f4db3cccb14e-bundle\") pod \"5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb\" (UID: \"51a6479b-5e2c-43c8-8d18-f4db3cccb14e\") " pod="openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb" Nov 26 22:56:07 crc kubenswrapper[5008]: I1126 22:56:07.941749 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkwbn\" (UniqueName: \"kubernetes.io/projected/51a6479b-5e2c-43c8-8d18-f4db3cccb14e-kube-api-access-kkwbn\") pod \"5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb\" (UID: \"51a6479b-5e2c-43c8-8d18-f4db3cccb14e\") " pod="openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb" Nov 26 22:56:08 crc kubenswrapper[5008]: I1126 
22:56:08.043673 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51a6479b-5e2c-43c8-8d18-f4db3cccb14e-util\") pod \"5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb\" (UID: \"51a6479b-5e2c-43c8-8d18-f4db3cccb14e\") " pod="openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb" Nov 26 22:56:08 crc kubenswrapper[5008]: I1126 22:56:08.043728 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51a6479b-5e2c-43c8-8d18-f4db3cccb14e-bundle\") pod \"5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb\" (UID: \"51a6479b-5e2c-43c8-8d18-f4db3cccb14e\") " pod="openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb" Nov 26 22:56:08 crc kubenswrapper[5008]: I1126 22:56:08.043813 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkwbn\" (UniqueName: \"kubernetes.io/projected/51a6479b-5e2c-43c8-8d18-f4db3cccb14e-kube-api-access-kkwbn\") pod \"5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb\" (UID: \"51a6479b-5e2c-43c8-8d18-f4db3cccb14e\") " pod="openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb" Nov 26 22:56:08 crc kubenswrapper[5008]: I1126 22:56:08.044435 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51a6479b-5e2c-43c8-8d18-f4db3cccb14e-util\") pod \"5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb\" (UID: \"51a6479b-5e2c-43c8-8d18-f4db3cccb14e\") " pod="openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb" Nov 26 22:56:08 crc kubenswrapper[5008]: I1126 22:56:08.044455 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/51a6479b-5e2c-43c8-8d18-f4db3cccb14e-bundle\") pod \"5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb\" (UID: \"51a6479b-5e2c-43c8-8d18-f4db3cccb14e\") " pod="openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb" Nov 26 22:56:08 crc kubenswrapper[5008]: I1126 22:56:08.071106 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkwbn\" (UniqueName: \"kubernetes.io/projected/51a6479b-5e2c-43c8-8d18-f4db3cccb14e-kube-api-access-kkwbn\") pod \"5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb\" (UID: \"51a6479b-5e2c-43c8-8d18-f4db3cccb14e\") " pod="openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb" Nov 26 22:56:08 crc kubenswrapper[5008]: I1126 22:56:08.130742 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb" Nov 26 22:56:08 crc kubenswrapper[5008]: I1126 22:56:08.657029 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb"] Nov 26 22:56:08 crc kubenswrapper[5008]: W1126 22:56:08.665693 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51a6479b_5e2c_43c8_8d18_f4db3cccb14e.slice/crio-dd7089b43f3baa3cf81cc0f524e9db94da78e28d9191f01ba84ab4c0dab6a06d WatchSource:0}: Error finding container dd7089b43f3baa3cf81cc0f524e9db94da78e28d9191f01ba84ab4c0dab6a06d: Status 404 returned error can't find the container with id dd7089b43f3baa3cf81cc0f524e9db94da78e28d9191f01ba84ab4c0dab6a06d Nov 26 22:56:08 crc kubenswrapper[5008]: I1126 22:56:08.722934 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb" 
event={"ID":"51a6479b-5e2c-43c8-8d18-f4db3cccb14e","Type":"ContainerStarted","Data":"dd7089b43f3baa3cf81cc0f524e9db94da78e28d9191f01ba84ab4c0dab6a06d"} Nov 26 22:56:09 crc kubenswrapper[5008]: I1126 22:56:09.730998 5008 generic.go:334] "Generic (PLEG): container finished" podID="51a6479b-5e2c-43c8-8d18-f4db3cccb14e" containerID="c53ea2fab6add6a88b7e67283759d19f55fa86660bf3a05300c5d7d8b4625808" exitCode=0 Nov 26 22:56:09 crc kubenswrapper[5008]: I1126 22:56:09.731054 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb" event={"ID":"51a6479b-5e2c-43c8-8d18-f4db3cccb14e","Type":"ContainerDied","Data":"c53ea2fab6add6a88b7e67283759d19f55fa86660bf3a05300c5d7d8b4625808"} Nov 26 22:56:10 crc kubenswrapper[5008]: I1126 22:56:10.745464 5008 generic.go:334] "Generic (PLEG): container finished" podID="51a6479b-5e2c-43c8-8d18-f4db3cccb14e" containerID="94d68f7a7afc46bdaf9de7527fb56604687f7aaca256bcd3bf7ca56f143b272f" exitCode=0 Nov 26 22:56:10 crc kubenswrapper[5008]: I1126 22:56:10.745602 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb" event={"ID":"51a6479b-5e2c-43c8-8d18-f4db3cccb14e","Type":"ContainerDied","Data":"94d68f7a7afc46bdaf9de7527fb56604687f7aaca256bcd3bf7ca56f143b272f"} Nov 26 22:56:11 crc kubenswrapper[5008]: I1126 22:56:11.767595 5008 generic.go:334] "Generic (PLEG): container finished" podID="51a6479b-5e2c-43c8-8d18-f4db3cccb14e" containerID="b9a6338f21334a219319ff2d8721a4b98204e94b03b446b33e500265375ebff6" exitCode=0 Nov 26 22:56:11 crc kubenswrapper[5008]: I1126 22:56:11.767657 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb" event={"ID":"51a6479b-5e2c-43c8-8d18-f4db3cccb14e","Type":"ContainerDied","Data":"b9a6338f21334a219319ff2d8721a4b98204e94b03b446b33e500265375ebff6"} Nov 26 
22:56:13 crc kubenswrapper[5008]: I1126 22:56:13.088592 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb" Nov 26 22:56:13 crc kubenswrapper[5008]: I1126 22:56:13.225933 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51a6479b-5e2c-43c8-8d18-f4db3cccb14e-util\") pod \"51a6479b-5e2c-43c8-8d18-f4db3cccb14e\" (UID: \"51a6479b-5e2c-43c8-8d18-f4db3cccb14e\") " Nov 26 22:56:13 crc kubenswrapper[5008]: I1126 22:56:13.226026 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkwbn\" (UniqueName: \"kubernetes.io/projected/51a6479b-5e2c-43c8-8d18-f4db3cccb14e-kube-api-access-kkwbn\") pod \"51a6479b-5e2c-43c8-8d18-f4db3cccb14e\" (UID: \"51a6479b-5e2c-43c8-8d18-f4db3cccb14e\") " Nov 26 22:56:13 crc kubenswrapper[5008]: I1126 22:56:13.227050 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51a6479b-5e2c-43c8-8d18-f4db3cccb14e-bundle\") pod \"51a6479b-5e2c-43c8-8d18-f4db3cccb14e\" (UID: \"51a6479b-5e2c-43c8-8d18-f4db3cccb14e\") " Nov 26 22:56:13 crc kubenswrapper[5008]: I1126 22:56:13.227789 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a6479b-5e2c-43c8-8d18-f4db3cccb14e-bundle" (OuterVolumeSpecName: "bundle") pod "51a6479b-5e2c-43c8-8d18-f4db3cccb14e" (UID: "51a6479b-5e2c-43c8-8d18-f4db3cccb14e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:56:13 crc kubenswrapper[5008]: I1126 22:56:13.238195 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a6479b-5e2c-43c8-8d18-f4db3cccb14e-kube-api-access-kkwbn" (OuterVolumeSpecName: "kube-api-access-kkwbn") pod "51a6479b-5e2c-43c8-8d18-f4db3cccb14e" (UID: "51a6479b-5e2c-43c8-8d18-f4db3cccb14e"). InnerVolumeSpecName "kube-api-access-kkwbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:56:13 crc kubenswrapper[5008]: I1126 22:56:13.244622 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a6479b-5e2c-43c8-8d18-f4db3cccb14e-util" (OuterVolumeSpecName: "util") pod "51a6479b-5e2c-43c8-8d18-f4db3cccb14e" (UID: "51a6479b-5e2c-43c8-8d18-f4db3cccb14e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:56:13 crc kubenswrapper[5008]: I1126 22:56:13.328922 5008 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51a6479b-5e2c-43c8-8d18-f4db3cccb14e-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 22:56:13 crc kubenswrapper[5008]: I1126 22:56:13.328955 5008 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51a6479b-5e2c-43c8-8d18-f4db3cccb14e-util\") on node \"crc\" DevicePath \"\"" Nov 26 22:56:13 crc kubenswrapper[5008]: I1126 22:56:13.328978 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkwbn\" (UniqueName: \"kubernetes.io/projected/51a6479b-5e2c-43c8-8d18-f4db3cccb14e-kube-api-access-kkwbn\") on node \"crc\" DevicePath \"\"" Nov 26 22:56:13 crc kubenswrapper[5008]: I1126 22:56:13.791577 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb" 
event={"ID":"51a6479b-5e2c-43c8-8d18-f4db3cccb14e","Type":"ContainerDied","Data":"dd7089b43f3baa3cf81cc0f524e9db94da78e28d9191f01ba84ab4c0dab6a06d"} Nov 26 22:56:13 crc kubenswrapper[5008]: I1126 22:56:13.791643 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd7089b43f3baa3cf81cc0f524e9db94da78e28d9191f01ba84ab4c0dab6a06d" Nov 26 22:56:13 crc kubenswrapper[5008]: I1126 22:56:13.791685 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5998248fafa942d3ef13f3e4682491a7dd1bbf8acb65109c0cd0c6027ez8zpb" Nov 26 22:56:24 crc kubenswrapper[5008]: I1126 22:56:24.374754 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk"] Nov 26 22:56:24 crc kubenswrapper[5008]: E1126 22:56:24.377352 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a6479b-5e2c-43c8-8d18-f4db3cccb14e" containerName="util" Nov 26 22:56:24 crc kubenswrapper[5008]: I1126 22:56:24.377530 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a6479b-5e2c-43c8-8d18-f4db3cccb14e" containerName="util" Nov 26 22:56:24 crc kubenswrapper[5008]: E1126 22:56:24.378782 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a6479b-5e2c-43c8-8d18-f4db3cccb14e" containerName="extract" Nov 26 22:56:24 crc kubenswrapper[5008]: I1126 22:56:24.378991 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a6479b-5e2c-43c8-8d18-f4db3cccb14e" containerName="extract" Nov 26 22:56:24 crc kubenswrapper[5008]: E1126 22:56:24.379167 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a6479b-5e2c-43c8-8d18-f4db3cccb14e" containerName="pull" Nov 26 22:56:24 crc kubenswrapper[5008]: I1126 22:56:24.379305 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a6479b-5e2c-43c8-8d18-f4db3cccb14e" containerName="pull" Nov 26 22:56:24 crc kubenswrapper[5008]: I1126 22:56:24.379749 5008 
memory_manager.go:354] "RemoveStaleState removing state" podUID="51a6479b-5e2c-43c8-8d18-f4db3cccb14e" containerName="extract" Nov 26 22:56:24 crc kubenswrapper[5008]: I1126 22:56:24.380394 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" Nov 26 22:56:24 crc kubenswrapper[5008]: I1126 22:56:24.384011 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-pgdvf" Nov 26 22:56:24 crc kubenswrapper[5008]: I1126 22:56:24.384627 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-service-cert" Nov 26 22:56:24 crc kubenswrapper[5008]: I1126 22:56:24.392736 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk"] Nov 26 22:56:24 crc kubenswrapper[5008]: I1126 22:56:24.524369 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95e78ec8-1a94-47ca-b471-10ba505c5583-apiservice-cert\") pod \"glance-operator-controller-manager-76d465bf76-r74xk\" (UID: \"95e78ec8-1a94-47ca-b471-10ba505c5583\") " pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" Nov 26 22:56:24 crc kubenswrapper[5008]: I1126 22:56:24.524424 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95e78ec8-1a94-47ca-b471-10ba505c5583-webhook-cert\") pod \"glance-operator-controller-manager-76d465bf76-r74xk\" (UID: \"95e78ec8-1a94-47ca-b471-10ba505c5583\") " pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" Nov 26 22:56:24 crc kubenswrapper[5008]: I1126 22:56:24.524462 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nzt4\" (UniqueName: \"kubernetes.io/projected/95e78ec8-1a94-47ca-b471-10ba505c5583-kube-api-access-6nzt4\") pod \"glance-operator-controller-manager-76d465bf76-r74xk\" (UID: \"95e78ec8-1a94-47ca-b471-10ba505c5583\") " pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" Nov 26 22:56:24 crc kubenswrapper[5008]: I1126 22:56:24.626151 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95e78ec8-1a94-47ca-b471-10ba505c5583-webhook-cert\") pod \"glance-operator-controller-manager-76d465bf76-r74xk\" (UID: \"95e78ec8-1a94-47ca-b471-10ba505c5583\") " pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" Nov 26 22:56:24 crc kubenswrapper[5008]: I1126 22:56:24.626220 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nzt4\" (UniqueName: \"kubernetes.io/projected/95e78ec8-1a94-47ca-b471-10ba505c5583-kube-api-access-6nzt4\") pod \"glance-operator-controller-manager-76d465bf76-r74xk\" (UID: \"95e78ec8-1a94-47ca-b471-10ba505c5583\") " pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" Nov 26 22:56:24 crc kubenswrapper[5008]: I1126 22:56:24.626299 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95e78ec8-1a94-47ca-b471-10ba505c5583-apiservice-cert\") pod \"glance-operator-controller-manager-76d465bf76-r74xk\" (UID: \"95e78ec8-1a94-47ca-b471-10ba505c5583\") " pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" Nov 26 22:56:24 crc kubenswrapper[5008]: I1126 22:56:24.634561 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95e78ec8-1a94-47ca-b471-10ba505c5583-apiservice-cert\") pod 
\"glance-operator-controller-manager-76d465bf76-r74xk\" (UID: \"95e78ec8-1a94-47ca-b471-10ba505c5583\") " pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" Nov 26 22:56:24 crc kubenswrapper[5008]: I1126 22:56:24.640993 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95e78ec8-1a94-47ca-b471-10ba505c5583-webhook-cert\") pod \"glance-operator-controller-manager-76d465bf76-r74xk\" (UID: \"95e78ec8-1a94-47ca-b471-10ba505c5583\") " pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" Nov 26 22:56:24 crc kubenswrapper[5008]: I1126 22:56:24.646837 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nzt4\" (UniqueName: \"kubernetes.io/projected/95e78ec8-1a94-47ca-b471-10ba505c5583-kube-api-access-6nzt4\") pod \"glance-operator-controller-manager-76d465bf76-r74xk\" (UID: \"95e78ec8-1a94-47ca-b471-10ba505c5583\") " pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" Nov 26 22:56:24 crc kubenswrapper[5008]: I1126 22:56:24.706140 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" Nov 26 22:56:25 crc kubenswrapper[5008]: I1126 22:56:25.152277 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk"] Nov 26 22:56:25 crc kubenswrapper[5008]: I1126 22:56:25.576674 5008 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podc006b487-ea33-4b37-b3fa-2105efbf7717"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podc006b487-ea33-4b37-b3fa-2105efbf7717] : Timed out while waiting for systemd to remove kubepods-besteffort-podc006b487_ea33_4b37_b3fa_2105efbf7717.slice" Nov 26 22:56:25 crc kubenswrapper[5008]: I1126 22:56:25.918539 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" event={"ID":"95e78ec8-1a94-47ca-b471-10ba505c5583","Type":"ContainerStarted","Data":"00eddd7b47aa97a7d58c3bf37f518302cadd4f5775a3aa86ac3b8ef0885aacc2"} Nov 26 22:56:26 crc kubenswrapper[5008]: I1126 22:56:26.931365 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" event={"ID":"95e78ec8-1a94-47ca-b471-10ba505c5583","Type":"ContainerStarted","Data":"0a9db12963cb3ca17ff0916f648fc20398d31f12881d38d434dcd29997c52725"} Nov 26 22:56:26 crc kubenswrapper[5008]: I1126 22:56:26.931627 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" Nov 26 22:56:26 crc kubenswrapper[5008]: I1126 22:56:26.948837 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" podStartSLOduration=1.5703008440000001 podStartE2EDuration="2.948820539s" podCreationTimestamp="2025-11-26 22:56:24 +0000 UTC" 
firstStartedPulling="2025-11-26 22:56:25.162369048 +0000 UTC m=+1060.575063060" lastFinishedPulling="2025-11-26 22:56:26.540888753 +0000 UTC m=+1061.953582755" observedRunningTime="2025-11-26 22:56:26.945763533 +0000 UTC m=+1062.358457535" watchObservedRunningTime="2025-11-26 22:56:26.948820539 +0000 UTC m=+1062.361514541" Nov 26 22:56:34 crc kubenswrapper[5008]: I1126 22:56:34.713352 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.397181 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.398185 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.400847 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-5g58j" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.400975 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.404792 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.408055 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.419632 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-5a2c-account-create-update-w89b5"] Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.420629 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-5a2c-account-create-update-w89b5" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.422863 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.423914 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-rdkjn"] Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.424671 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-rdkjn" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.428010 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-5a2c-account-create-update-w89b5"] Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.464557 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.483539 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-rdkjn"] Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.527225 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/04c788ce-01d7-4808-a97b-d191103496a4-openstack-config\") pod \"openstackclient\" (UID: \"04c788ce-01d7-4808-a97b-d191103496a4\") " pod="glance-kuttl-tests/openstackclient" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.527821 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65e09b02-d156-4def-a546-5cfe04eda5e2-operator-scripts\") pod \"glance-db-create-rdkjn\" (UID: \"65e09b02-d156-4def-a546-5cfe04eda5e2\") " pod="glance-kuttl-tests/glance-db-create-rdkjn" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.527988 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnmxm\" (UniqueName: \"kubernetes.io/projected/04c788ce-01d7-4808-a97b-d191103496a4-kube-api-access-xnmxm\") pod \"openstackclient\" (UID: \"04c788ce-01d7-4808-a97b-d191103496a4\") " pod="glance-kuttl-tests/openstackclient" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.528181 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06ae4ab1-219f-4500-b1ea-0db090c7a3ac-operator-scripts\") pod \"glance-5a2c-account-create-update-w89b5\" (UID: \"06ae4ab1-219f-4500-b1ea-0db090c7a3ac\") " pod="glance-kuttl-tests/glance-5a2c-account-create-update-w89b5" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.528321 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/04c788ce-01d7-4808-a97b-d191103496a4-openstack-scripts\") pod \"openstackclient\" (UID: \"04c788ce-01d7-4808-a97b-d191103496a4\") " pod="glance-kuttl-tests/openstackclient" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.528458 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/04c788ce-01d7-4808-a97b-d191103496a4-openstack-config-secret\") pod \"openstackclient\" (UID: \"04c788ce-01d7-4808-a97b-d191103496a4\") " pod="glance-kuttl-tests/openstackclient" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.528532 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djz54\" (UniqueName: \"kubernetes.io/projected/65e09b02-d156-4def-a546-5cfe04eda5e2-kube-api-access-djz54\") pod \"glance-db-create-rdkjn\" (UID: \"65e09b02-d156-4def-a546-5cfe04eda5e2\") " 
pod="glance-kuttl-tests/glance-db-create-rdkjn" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.528605 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fq77\" (UniqueName: \"kubernetes.io/projected/06ae4ab1-219f-4500-b1ea-0db090c7a3ac-kube-api-access-6fq77\") pod \"glance-5a2c-account-create-update-w89b5\" (UID: \"06ae4ab1-219f-4500-b1ea-0db090c7a3ac\") " pod="glance-kuttl-tests/glance-5a2c-account-create-update-w89b5" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.629142 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06ae4ab1-219f-4500-b1ea-0db090c7a3ac-operator-scripts\") pod \"glance-5a2c-account-create-update-w89b5\" (UID: \"06ae4ab1-219f-4500-b1ea-0db090c7a3ac\") " pod="glance-kuttl-tests/glance-5a2c-account-create-update-w89b5" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.629203 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/04c788ce-01d7-4808-a97b-d191103496a4-openstack-scripts\") pod \"openstackclient\" (UID: \"04c788ce-01d7-4808-a97b-d191103496a4\") " pod="glance-kuttl-tests/openstackclient" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.629250 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/04c788ce-01d7-4808-a97b-d191103496a4-openstack-config-secret\") pod \"openstackclient\" (UID: \"04c788ce-01d7-4808-a97b-d191103496a4\") " pod="glance-kuttl-tests/openstackclient" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.629272 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djz54\" (UniqueName: \"kubernetes.io/projected/65e09b02-d156-4def-a546-5cfe04eda5e2-kube-api-access-djz54\") pod \"glance-db-create-rdkjn\" (UID: 
\"65e09b02-d156-4def-a546-5cfe04eda5e2\") " pod="glance-kuttl-tests/glance-db-create-rdkjn" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.629311 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fq77\" (UniqueName: \"kubernetes.io/projected/06ae4ab1-219f-4500-b1ea-0db090c7a3ac-kube-api-access-6fq77\") pod \"glance-5a2c-account-create-update-w89b5\" (UID: \"06ae4ab1-219f-4500-b1ea-0db090c7a3ac\") " pod="glance-kuttl-tests/glance-5a2c-account-create-update-w89b5" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.629379 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/04c788ce-01d7-4808-a97b-d191103496a4-openstack-config\") pod \"openstackclient\" (UID: \"04c788ce-01d7-4808-a97b-d191103496a4\") " pod="glance-kuttl-tests/openstackclient" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.629402 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65e09b02-d156-4def-a546-5cfe04eda5e2-operator-scripts\") pod \"glance-db-create-rdkjn\" (UID: \"65e09b02-d156-4def-a546-5cfe04eda5e2\") " pod="glance-kuttl-tests/glance-db-create-rdkjn" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.629430 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnmxm\" (UniqueName: \"kubernetes.io/projected/04c788ce-01d7-4808-a97b-d191103496a4-kube-api-access-xnmxm\") pod \"openstackclient\" (UID: \"04c788ce-01d7-4808-a97b-d191103496a4\") " pod="glance-kuttl-tests/openstackclient" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.630704 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/04c788ce-01d7-4808-a97b-d191103496a4-openstack-scripts\") pod \"openstackclient\" (UID: 
\"04c788ce-01d7-4808-a97b-d191103496a4\") " pod="glance-kuttl-tests/openstackclient" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.630726 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/04c788ce-01d7-4808-a97b-d191103496a4-openstack-config\") pod \"openstackclient\" (UID: \"04c788ce-01d7-4808-a97b-d191103496a4\") " pod="glance-kuttl-tests/openstackclient" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.631171 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65e09b02-d156-4def-a546-5cfe04eda5e2-operator-scripts\") pod \"glance-db-create-rdkjn\" (UID: \"65e09b02-d156-4def-a546-5cfe04eda5e2\") " pod="glance-kuttl-tests/glance-db-create-rdkjn" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.631344 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06ae4ab1-219f-4500-b1ea-0db090c7a3ac-operator-scripts\") pod \"glance-5a2c-account-create-update-w89b5\" (UID: \"06ae4ab1-219f-4500-b1ea-0db090c7a3ac\") " pod="glance-kuttl-tests/glance-5a2c-account-create-update-w89b5" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.645054 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/04c788ce-01d7-4808-a97b-d191103496a4-openstack-config-secret\") pod \"openstackclient\" (UID: \"04c788ce-01d7-4808-a97b-d191103496a4\") " pod="glance-kuttl-tests/openstackclient" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.645484 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djz54\" (UniqueName: \"kubernetes.io/projected/65e09b02-d156-4def-a546-5cfe04eda5e2-kube-api-access-djz54\") pod \"glance-db-create-rdkjn\" (UID: \"65e09b02-d156-4def-a546-5cfe04eda5e2\") " 
pod="glance-kuttl-tests/glance-db-create-rdkjn" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.653057 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnmxm\" (UniqueName: \"kubernetes.io/projected/04c788ce-01d7-4808-a97b-d191103496a4-kube-api-access-xnmxm\") pod \"openstackclient\" (UID: \"04c788ce-01d7-4808-a97b-d191103496a4\") " pod="glance-kuttl-tests/openstackclient" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.656198 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fq77\" (UniqueName: \"kubernetes.io/projected/06ae4ab1-219f-4500-b1ea-0db090c7a3ac-kube-api-access-6fq77\") pod \"glance-5a2c-account-create-update-w89b5\" (UID: \"06ae4ab1-219f-4500-b1ea-0db090c7a3ac\") " pod="glance-kuttl-tests/glance-5a2c-account-create-update-w89b5" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.711694 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.733769 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-5a2c-account-create-update-w89b5" Nov 26 22:56:36 crc kubenswrapper[5008]: I1126 22:56:36.741609 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-rdkjn" Nov 26 22:56:37 crc kubenswrapper[5008]: I1126 22:56:37.156361 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-5a2c-account-create-update-w89b5"] Nov 26 22:56:37 crc kubenswrapper[5008]: W1126 22:56:37.161738 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06ae4ab1_219f_4500_b1ea_0db090c7a3ac.slice/crio-8e6267d37e8473d87b3f1a5832eadf343fd34c589b92a9a34c0e1dec94b070a6 WatchSource:0}: Error finding container 8e6267d37e8473d87b3f1a5832eadf343fd34c589b92a9a34c0e1dec94b070a6: Status 404 returned error can't find the container with id 8e6267d37e8473d87b3f1a5832eadf343fd34c589b92a9a34c0e1dec94b070a6 Nov 26 22:56:37 crc kubenswrapper[5008]: I1126 22:56:37.229130 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 26 22:56:37 crc kubenswrapper[5008]: I1126 22:56:37.233797 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-rdkjn"] Nov 26 22:56:37 crc kubenswrapper[5008]: W1126 22:56:37.242423 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65e09b02_d156_4def_a546_5cfe04eda5e2.slice/crio-679ec52a834896a427d2b814e16f29113add727832013a0f1ee037f3840bcca0 WatchSource:0}: Error finding container 679ec52a834896a427d2b814e16f29113add727832013a0f1ee037f3840bcca0: Status 404 returned error can't find the container with id 679ec52a834896a427d2b814e16f29113add727832013a0f1ee037f3840bcca0 Nov 26 22:56:37 crc kubenswrapper[5008]: W1126 22:56:37.243074 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04c788ce_01d7_4808_a97b_d191103496a4.slice/crio-deab1a80cebdd72da10411d02b6aea133e1dee5127d31040e9c115c961250405 WatchSource:0}: Error 
finding container deab1a80cebdd72da10411d02b6aea133e1dee5127d31040e9c115c961250405: Status 404 returned error can't find the container with id deab1a80cebdd72da10411d02b6aea133e1dee5127d31040e9c115c961250405 Nov 26 22:56:38 crc kubenswrapper[5008]: I1126 22:56:38.047784 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"04c788ce-01d7-4808-a97b-d191103496a4","Type":"ContainerStarted","Data":"deab1a80cebdd72da10411d02b6aea133e1dee5127d31040e9c115c961250405"} Nov 26 22:56:38 crc kubenswrapper[5008]: I1126 22:56:38.051191 5008 generic.go:334] "Generic (PLEG): container finished" podID="06ae4ab1-219f-4500-b1ea-0db090c7a3ac" containerID="36f217132e067758614f2904e77682580bca0cacc4d32c857da8ec7c2f264f16" exitCode=0 Nov 26 22:56:38 crc kubenswrapper[5008]: I1126 22:56:38.051223 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-5a2c-account-create-update-w89b5" event={"ID":"06ae4ab1-219f-4500-b1ea-0db090c7a3ac","Type":"ContainerDied","Data":"36f217132e067758614f2904e77682580bca0cacc4d32c857da8ec7c2f264f16"} Nov 26 22:56:38 crc kubenswrapper[5008]: I1126 22:56:38.051247 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-5a2c-account-create-update-w89b5" event={"ID":"06ae4ab1-219f-4500-b1ea-0db090c7a3ac","Type":"ContainerStarted","Data":"8e6267d37e8473d87b3f1a5832eadf343fd34c589b92a9a34c0e1dec94b070a6"} Nov 26 22:56:38 crc kubenswrapper[5008]: I1126 22:56:38.052735 5008 generic.go:334] "Generic (PLEG): container finished" podID="65e09b02-d156-4def-a546-5cfe04eda5e2" containerID="438c620210d1abe035f72ec94d1312307d1298390e190607a64c3946021e9f0d" exitCode=0 Nov 26 22:56:38 crc kubenswrapper[5008]: I1126 22:56:38.052759 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-rdkjn" 
event={"ID":"65e09b02-d156-4def-a546-5cfe04eda5e2","Type":"ContainerDied","Data":"438c620210d1abe035f72ec94d1312307d1298390e190607a64c3946021e9f0d"}
Nov 26 22:56:38 crc kubenswrapper[5008]: I1126 22:56:38.052773 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-rdkjn" event={"ID":"65e09b02-d156-4def-a546-5cfe04eda5e2","Type":"ContainerStarted","Data":"679ec52a834896a427d2b814e16f29113add727832013a0f1ee037f3840bcca0"}
Nov 26 22:56:39 crc kubenswrapper[5008]: I1126 22:56:39.392401 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-rdkjn"
Nov 26 22:56:39 crc kubenswrapper[5008]: I1126 22:56:39.509651 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-5a2c-account-create-update-w89b5"
Nov 26 22:56:39 crc kubenswrapper[5008]: I1126 22:56:39.572565 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65e09b02-d156-4def-a546-5cfe04eda5e2-operator-scripts\") pod \"65e09b02-d156-4def-a546-5cfe04eda5e2\" (UID: \"65e09b02-d156-4def-a546-5cfe04eda5e2\") "
Nov 26 22:56:39 crc kubenswrapper[5008]: I1126 22:56:39.572715 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djz54\" (UniqueName: \"kubernetes.io/projected/65e09b02-d156-4def-a546-5cfe04eda5e2-kube-api-access-djz54\") pod \"65e09b02-d156-4def-a546-5cfe04eda5e2\" (UID: \"65e09b02-d156-4def-a546-5cfe04eda5e2\") "
Nov 26 22:56:39 crc kubenswrapper[5008]: I1126 22:56:39.573568 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65e09b02-d156-4def-a546-5cfe04eda5e2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "65e09b02-d156-4def-a546-5cfe04eda5e2" (UID: "65e09b02-d156-4def-a546-5cfe04eda5e2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:56:39 crc kubenswrapper[5008]: I1126 22:56:39.584249 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e09b02-d156-4def-a546-5cfe04eda5e2-kube-api-access-djz54" (OuterVolumeSpecName: "kube-api-access-djz54") pod "65e09b02-d156-4def-a546-5cfe04eda5e2" (UID: "65e09b02-d156-4def-a546-5cfe04eda5e2"). InnerVolumeSpecName "kube-api-access-djz54". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:56:39 crc kubenswrapper[5008]: I1126 22:56:39.673588 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06ae4ab1-219f-4500-b1ea-0db090c7a3ac-operator-scripts\") pod \"06ae4ab1-219f-4500-b1ea-0db090c7a3ac\" (UID: \"06ae4ab1-219f-4500-b1ea-0db090c7a3ac\") "
Nov 26 22:56:39 crc kubenswrapper[5008]: I1126 22:56:39.673636 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fq77\" (UniqueName: \"kubernetes.io/projected/06ae4ab1-219f-4500-b1ea-0db090c7a3ac-kube-api-access-6fq77\") pod \"06ae4ab1-219f-4500-b1ea-0db090c7a3ac\" (UID: \"06ae4ab1-219f-4500-b1ea-0db090c7a3ac\") "
Nov 26 22:56:39 crc kubenswrapper[5008]: I1126 22:56:39.674037 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65e09b02-d156-4def-a546-5cfe04eda5e2-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 26 22:56:39 crc kubenswrapper[5008]: I1126 22:56:39.674053 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djz54\" (UniqueName: \"kubernetes.io/projected/65e09b02-d156-4def-a546-5cfe04eda5e2-kube-api-access-djz54\") on node \"crc\" DevicePath \"\""
Nov 26 22:56:39 crc kubenswrapper[5008]: I1126 22:56:39.674744 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06ae4ab1-219f-4500-b1ea-0db090c7a3ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "06ae4ab1-219f-4500-b1ea-0db090c7a3ac" (UID: "06ae4ab1-219f-4500-b1ea-0db090c7a3ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 22:56:39 crc kubenswrapper[5008]: I1126 22:56:39.678238 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06ae4ab1-219f-4500-b1ea-0db090c7a3ac-kube-api-access-6fq77" (OuterVolumeSpecName: "kube-api-access-6fq77") pod "06ae4ab1-219f-4500-b1ea-0db090c7a3ac" (UID: "06ae4ab1-219f-4500-b1ea-0db090c7a3ac"). InnerVolumeSpecName "kube-api-access-6fq77". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:56:39 crc kubenswrapper[5008]: I1126 22:56:39.775868 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06ae4ab1-219f-4500-b1ea-0db090c7a3ac-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 26 22:56:39 crc kubenswrapper[5008]: I1126 22:56:39.775936 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fq77\" (UniqueName: \"kubernetes.io/projected/06ae4ab1-219f-4500-b1ea-0db090c7a3ac-kube-api-access-6fq77\") on node \"crc\" DevicePath \"\""
Nov 26 22:56:40 crc kubenswrapper[5008]: I1126 22:56:40.073794 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-5a2c-account-create-update-w89b5" event={"ID":"06ae4ab1-219f-4500-b1ea-0db090c7a3ac","Type":"ContainerDied","Data":"8e6267d37e8473d87b3f1a5832eadf343fd34c589b92a9a34c0e1dec94b070a6"}
Nov 26 22:56:40 crc kubenswrapper[5008]: I1126 22:56:40.073838 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-5a2c-account-create-update-w89b5"
Nov 26 22:56:40 crc kubenswrapper[5008]: I1126 22:56:40.073853 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e6267d37e8473d87b3f1a5832eadf343fd34c589b92a9a34c0e1dec94b070a6"
Nov 26 22:56:40 crc kubenswrapper[5008]: I1126 22:56:40.084507 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-rdkjn" event={"ID":"65e09b02-d156-4def-a546-5cfe04eda5e2","Type":"ContainerDied","Data":"679ec52a834896a427d2b814e16f29113add727832013a0f1ee037f3840bcca0"}
Nov 26 22:56:40 crc kubenswrapper[5008]: I1126 22:56:40.084571 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="679ec52a834896a427d2b814e16f29113add727832013a0f1ee037f3840bcca0"
Nov 26 22:56:40 crc kubenswrapper[5008]: I1126 22:56:40.084583 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-rdkjn"
Nov 26 22:56:41 crc kubenswrapper[5008]: I1126 22:56:41.528700 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-dvtcp"]
Nov 26 22:56:41 crc kubenswrapper[5008]: E1126 22:56:41.529206 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06ae4ab1-219f-4500-b1ea-0db090c7a3ac" containerName="mariadb-account-create-update"
Nov 26 22:56:41 crc kubenswrapper[5008]: I1126 22:56:41.529219 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="06ae4ab1-219f-4500-b1ea-0db090c7a3ac" containerName="mariadb-account-create-update"
Nov 26 22:56:41 crc kubenswrapper[5008]: E1126 22:56:41.529237 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e09b02-d156-4def-a546-5cfe04eda5e2" containerName="mariadb-database-create"
Nov 26 22:56:41 crc kubenswrapper[5008]: I1126 22:56:41.529243 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e09b02-d156-4def-a546-5cfe04eda5e2" containerName="mariadb-database-create"
Nov 26 22:56:41 crc kubenswrapper[5008]: I1126 22:56:41.529376 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e09b02-d156-4def-a546-5cfe04eda5e2" containerName="mariadb-database-create"
Nov 26 22:56:41 crc kubenswrapper[5008]: I1126 22:56:41.529389 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="06ae4ab1-219f-4500-b1ea-0db090c7a3ac" containerName="mariadb-account-create-update"
Nov 26 22:56:41 crc kubenswrapper[5008]: I1126 22:56:41.529813 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-dvtcp"
Nov 26 22:56:41 crc kubenswrapper[5008]: I1126 22:56:41.534525 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data"
Nov 26 22:56:41 crc kubenswrapper[5008]: I1126 22:56:41.534745 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-p4jpk"
Nov 26 22:56:41 crc kubenswrapper[5008]: I1126 22:56:41.541165 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-dvtcp"]
Nov 26 22:56:41 crc kubenswrapper[5008]: I1126 22:56:41.707748 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e862e5b1-269c-4e52-8a5c-360d79ba776e-db-sync-config-data\") pod \"glance-db-sync-dvtcp\" (UID: \"e862e5b1-269c-4e52-8a5c-360d79ba776e\") " pod="glance-kuttl-tests/glance-db-sync-dvtcp"
Nov 26 22:56:41 crc kubenswrapper[5008]: I1126 22:56:41.707850 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb85x\" (UniqueName: \"kubernetes.io/projected/e862e5b1-269c-4e52-8a5c-360d79ba776e-kube-api-access-vb85x\") pod \"glance-db-sync-dvtcp\" (UID: \"e862e5b1-269c-4e52-8a5c-360d79ba776e\") " pod="glance-kuttl-tests/glance-db-sync-dvtcp"
Nov 26 22:56:41 crc kubenswrapper[5008]: I1126 22:56:41.707871 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e862e5b1-269c-4e52-8a5c-360d79ba776e-config-data\") pod \"glance-db-sync-dvtcp\" (UID: \"e862e5b1-269c-4e52-8a5c-360d79ba776e\") " pod="glance-kuttl-tests/glance-db-sync-dvtcp"
Nov 26 22:56:41 crc kubenswrapper[5008]: I1126 22:56:41.809788 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e862e5b1-269c-4e52-8a5c-360d79ba776e-config-data\") pod \"glance-db-sync-dvtcp\" (UID: \"e862e5b1-269c-4e52-8a5c-360d79ba776e\") " pod="glance-kuttl-tests/glance-db-sync-dvtcp"
Nov 26 22:56:41 crc kubenswrapper[5008]: I1126 22:56:41.809912 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e862e5b1-269c-4e52-8a5c-360d79ba776e-db-sync-config-data\") pod \"glance-db-sync-dvtcp\" (UID: \"e862e5b1-269c-4e52-8a5c-360d79ba776e\") " pod="glance-kuttl-tests/glance-db-sync-dvtcp"
Nov 26 22:56:41 crc kubenswrapper[5008]: I1126 22:56:41.810616 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb85x\" (UniqueName: \"kubernetes.io/projected/e862e5b1-269c-4e52-8a5c-360d79ba776e-kube-api-access-vb85x\") pod \"glance-db-sync-dvtcp\" (UID: \"e862e5b1-269c-4e52-8a5c-360d79ba776e\") " pod="glance-kuttl-tests/glance-db-sync-dvtcp"
Nov 26 22:56:41 crc kubenswrapper[5008]: I1126 22:56:41.814564 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e862e5b1-269c-4e52-8a5c-360d79ba776e-db-sync-config-data\") pod \"glance-db-sync-dvtcp\" (UID: \"e862e5b1-269c-4e52-8a5c-360d79ba776e\") " pod="glance-kuttl-tests/glance-db-sync-dvtcp"
Nov 26 22:56:41 crc kubenswrapper[5008]: I1126 22:56:41.828380 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e862e5b1-269c-4e52-8a5c-360d79ba776e-config-data\") pod \"glance-db-sync-dvtcp\" (UID: \"e862e5b1-269c-4e52-8a5c-360d79ba776e\") " pod="glance-kuttl-tests/glance-db-sync-dvtcp"
Nov 26 22:56:41 crc kubenswrapper[5008]: I1126 22:56:41.832556 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb85x\" (UniqueName: \"kubernetes.io/projected/e862e5b1-269c-4e52-8a5c-360d79ba776e-kube-api-access-vb85x\") pod \"glance-db-sync-dvtcp\" (UID: \"e862e5b1-269c-4e52-8a5c-360d79ba776e\") " pod="glance-kuttl-tests/glance-db-sync-dvtcp"
Nov 26 22:56:41 crc kubenswrapper[5008]: I1126 22:56:41.844961 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-dvtcp"
Nov 26 22:56:42 crc kubenswrapper[5008]: I1126 22:56:42.069992 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-dvtcp"]
Nov 26 22:56:42 crc kubenswrapper[5008]: I1126 22:56:42.103544 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-dvtcp" event={"ID":"e862e5b1-269c-4e52-8a5c-360d79ba776e","Type":"ContainerStarted","Data":"5153e4b18cb7f7821f7752c41d925d42f8ebbf059b3a3207ef37e16172ff22c0"}
Nov 26 22:56:48 crc kubenswrapper[5008]: I1126 22:56:48.151126 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"04c788ce-01d7-4808-a97b-d191103496a4","Type":"ContainerStarted","Data":"78ab9b5e88efeb4f2f00e2f7c26bfc8222c42ded88e1f4018baf4da1e9b7aee4"}
Nov 26 22:56:48 crc kubenswrapper[5008]: I1126 22:56:48.174920 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=1.923318222 podStartE2EDuration="12.174898058s" podCreationTimestamp="2025-11-26 22:56:36 +0000 UTC" firstStartedPulling="2025-11-26 22:56:37.244802232 +0000 UTC m=+1072.657496234" lastFinishedPulling="2025-11-26 22:56:47.496382028 +0000 UTC m=+1082.909076070" observedRunningTime="2025-11-26 22:56:48.172500113 +0000 UTC m=+1083.585194125" watchObservedRunningTime="2025-11-26 22:56:48.174898058 +0000 UTC m=+1083.587592060"
Nov 26 22:56:59 crc kubenswrapper[5008]: I1126 22:56:59.281451 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 22:56:59 crc kubenswrapper[5008]: I1126 22:56:59.282164 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 26 22:57:00 crc kubenswrapper[5008]: I1126 22:57:00.271045 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-dvtcp" event={"ID":"e862e5b1-269c-4e52-8a5c-360d79ba776e","Type":"ContainerStarted","Data":"4f44baf2490363918a126873442f5105a7450a2639ae2508996eeecd63a74d92"}
Nov 26 22:57:06 crc kubenswrapper[5008]: I1126 22:57:06.328616 5008 generic.go:334] "Generic (PLEG): container finished" podID="e862e5b1-269c-4e52-8a5c-360d79ba776e" containerID="4f44baf2490363918a126873442f5105a7450a2639ae2508996eeecd63a74d92" exitCode=0
Nov 26 22:57:06 crc kubenswrapper[5008]: I1126 22:57:06.328762 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-dvtcp" event={"ID":"e862e5b1-269c-4e52-8a5c-360d79ba776e","Type":"ContainerDied","Data":"4f44baf2490363918a126873442f5105a7450a2639ae2508996eeecd63a74d92"}
Nov 26 22:57:07 crc kubenswrapper[5008]: I1126 22:57:07.983616 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-dvtcp"
Nov 26 22:57:08 crc kubenswrapper[5008]: I1126 22:57:08.061265 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e862e5b1-269c-4e52-8a5c-360d79ba776e-db-sync-config-data\") pod \"e862e5b1-269c-4e52-8a5c-360d79ba776e\" (UID: \"e862e5b1-269c-4e52-8a5c-360d79ba776e\") "
Nov 26 22:57:08 crc kubenswrapper[5008]: I1126 22:57:08.061349 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb85x\" (UniqueName: \"kubernetes.io/projected/e862e5b1-269c-4e52-8a5c-360d79ba776e-kube-api-access-vb85x\") pod \"e862e5b1-269c-4e52-8a5c-360d79ba776e\" (UID: \"e862e5b1-269c-4e52-8a5c-360d79ba776e\") "
Nov 26 22:57:08 crc kubenswrapper[5008]: I1126 22:57:08.061483 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e862e5b1-269c-4e52-8a5c-360d79ba776e-config-data\") pod \"e862e5b1-269c-4e52-8a5c-360d79ba776e\" (UID: \"e862e5b1-269c-4e52-8a5c-360d79ba776e\") "
Nov 26 22:57:08 crc kubenswrapper[5008]: I1126 22:57:08.066573 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e862e5b1-269c-4e52-8a5c-360d79ba776e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e862e5b1-269c-4e52-8a5c-360d79ba776e" (UID: "e862e5b1-269c-4e52-8a5c-360d79ba776e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:57:08 crc kubenswrapper[5008]: I1126 22:57:08.067474 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e862e5b1-269c-4e52-8a5c-360d79ba776e-kube-api-access-vb85x" (OuterVolumeSpecName: "kube-api-access-vb85x") pod "e862e5b1-269c-4e52-8a5c-360d79ba776e" (UID: "e862e5b1-269c-4e52-8a5c-360d79ba776e"). InnerVolumeSpecName "kube-api-access-vb85x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:57:08 crc kubenswrapper[5008]: I1126 22:57:08.113241 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e862e5b1-269c-4e52-8a5c-360d79ba776e-config-data" (OuterVolumeSpecName: "config-data") pod "e862e5b1-269c-4e52-8a5c-360d79ba776e" (UID: "e862e5b1-269c-4e52-8a5c-360d79ba776e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:57:08 crc kubenswrapper[5008]: I1126 22:57:08.162517 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb85x\" (UniqueName: \"kubernetes.io/projected/e862e5b1-269c-4e52-8a5c-360d79ba776e-kube-api-access-vb85x\") on node \"crc\" DevicePath \"\""
Nov 26 22:57:08 crc kubenswrapper[5008]: I1126 22:57:08.162556 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e862e5b1-269c-4e52-8a5c-360d79ba776e-config-data\") on node \"crc\" DevicePath \"\""
Nov 26 22:57:08 crc kubenswrapper[5008]: I1126 22:57:08.162565 5008 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e862e5b1-269c-4e52-8a5c-360d79ba776e-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Nov 26 22:57:08 crc kubenswrapper[5008]: I1126 22:57:08.600076 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-dvtcp" event={"ID":"e862e5b1-269c-4e52-8a5c-360d79ba776e","Type":"ContainerDied","Data":"5153e4b18cb7f7821f7752c41d925d42f8ebbf059b3a3207ef37e16172ff22c0"}
Nov 26 22:57:08 crc kubenswrapper[5008]: I1126 22:57:08.600121 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5153e4b18cb7f7821f7752c41d925d42f8ebbf059b3a3207ef37e16172ff22c0"
Nov 26 22:57:08 crc kubenswrapper[5008]: I1126 22:57:08.600187 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-dvtcp"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.851302 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Nov 26 22:57:09 crc kubenswrapper[5008]: E1126 22:57:09.851819 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e862e5b1-269c-4e52-8a5c-360d79ba776e" containerName="glance-db-sync"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.851831 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e862e5b1-269c-4e52-8a5c-360d79ba776e" containerName="glance-db-sync"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.851979 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="e862e5b1-269c-4e52-8a5c-360d79ba776e" containerName="glance-db-sync"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.852629 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.857927 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.858180 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-p4jpk"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.858472 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.883213 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"]
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.884468 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.890648 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-lib-modules\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.890718 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-sys\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.890754 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-run\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.890788 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.890820 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.890851 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18f7a43a-34f6-4567-9e34-6f465d4fb60a-logs\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.890949 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-dev\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.891030 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.891142 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f7a43a-34f6-4567-9e34-6f465d4fb60a-config-data\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.891197 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18f7a43a-34f6-4567-9e34-6f465d4fb60a-httpd-run\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.891269 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f7a43a-34f6-4567-9e34-6f465d4fb60a-scripts\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.891303 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-etc-nvme\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.891376 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdmpp\" (UniqueName: \"kubernetes.io/projected/18f7a43a-34f6-4567-9e34-6f465d4fb60a-kube-api-access-bdmpp\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.891562 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.892827 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.908729 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"]
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.928679 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"]
Nov 26 22:57:09 crc kubenswrapper[5008]: E1126 22:57:09.929238 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config-data dev etc-iscsi etc-nvme glance glance-cache httpd-run kube-api-access-pdbr5 lib-modules logs run scripts sys var-locks-brick], unattached volumes=[], failed to process volumes=[config-data dev etc-iscsi etc-nvme glance glance-cache httpd-run kube-api-access-pdbr5 lib-modules logs run scripts sys var-locks-brick]: context canceled" pod="glance-kuttl-tests/glance-default-single-1" podUID="92b1a68c-05fc-4027-9ac8-ebf5e0355134"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.992673 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-etc-nvme\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.992724 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.992750 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-etc-nvme\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.992782 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdmpp\" (UniqueName: \"kubernetes.io/projected/18f7a43a-34f6-4567-9e34-6f465d4fb60a-kube-api-access-bdmpp\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.992809 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.992886 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-dev\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.992915 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-etc-nvme\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993040 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993097 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/92b1a68c-05fc-4027-9ac8-ebf5e0355134-httpd-run\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993183 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-lib-modules\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993273 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993340 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-run\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993400 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-lib-modules\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993435 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-sys\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993465 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-lib-modules\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993470 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-run\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993526 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993543 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-sys\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993528 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-run\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993550 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993616 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993653 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18f7a43a-34f6-4567-9e34-6f465d4fb60a-logs\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993674 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993699 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b1a68c-05fc-4027-9ac8-ebf5e0355134-config-data\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993721 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-sys\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993758 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92b1a68c-05fc-4027-9ac8-ebf5e0355134-logs\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993836 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-dev\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993840 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993871 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-dev\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993921 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.993984 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.994021 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f7a43a-34f6-4567-9e34-6f465d4fb60a-config-data\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") "
pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.994035 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18f7a43a-34f6-4567-9e34-6f465d4fb60a-httpd-run\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.994060 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdbr5\" (UniqueName: \"kubernetes.io/projected/92b1a68c-05fc-4027-9ac8-ebf5e0355134-kube-api-access-pdbr5\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.994083 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92b1a68c-05fc-4027-9ac8-ebf5e0355134-scripts\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.994105 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f7a43a-34f6-4567-9e34-6f465d4fb60a-scripts\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.994278 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") device mount path \"/mnt/openstack/pv11\"" 
pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.994405 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18f7a43a-34f6-4567-9e34-6f465d4fb60a-logs\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:09 crc kubenswrapper[5008]: I1126 22:57:09.994497 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18f7a43a-34f6-4567-9e34-6f465d4fb60a-httpd-run\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.001078 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f7a43a-34f6-4567-9e34-6f465d4fb60a-scripts\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.001365 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f7a43a-34f6-4567-9e34-6f465d4fb60a-config-data\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.014808 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.024488 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-bdmpp\" (UniqueName: \"kubernetes.io/projected/18f7a43a-34f6-4567-9e34-6f465d4fb60a-kube-api-access-bdmpp\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.024796 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.095341 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92b1a68c-05fc-4027-9ac8-ebf5e0355134-scripts\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.095393 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.095419 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-etc-nvme\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.095534 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-etc-nvme\") pod 
\"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.095551 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.095590 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.095617 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-dev\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.095658 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-dev\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.095682 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc 
kubenswrapper[5008]: I1126 22:57:10.095692 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.095705 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/92b1a68c-05fc-4027-9ac8-ebf5e0355134-httpd-run\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.095843 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-lib-modules\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.095854 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") device mount path \"/mnt/openstack/pv20\"" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.095890 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-lib-modules\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.096053 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-run\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.096140 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-run\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.096171 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b1a68c-05fc-4027-9ac8-ebf5e0355134-config-data\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.096207 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-sys\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.096272 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92b1a68c-05fc-4027-9ac8-ebf5e0355134-logs\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.096321 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-sys\") pod 
\"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.096390 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.096489 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdbr5\" (UniqueName: \"kubernetes.io/projected/92b1a68c-05fc-4027-9ac8-ebf5e0355134-kube-api-access-pdbr5\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.096600 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") device mount path \"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.096636 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92b1a68c-05fc-4027-9ac8-ebf5e0355134-logs\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.096846 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/92b1a68c-05fc-4027-9ac8-ebf5e0355134-httpd-run\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " 
pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.100451 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b1a68c-05fc-4027-9ac8-ebf5e0355134-config-data\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.102838 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92b1a68c-05fc-4027-9ac8-ebf5e0355134-scripts\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.113014 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.116447 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdbr5\" (UniqueName: \"kubernetes.io/projected/92b1a68c-05fc-4027-9ac8-ebf5e0355134-kube-api-access-pdbr5\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.119394 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-single-1\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.180267 5008 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.480840 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.626110 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.626303 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"18f7a43a-34f6-4567-9e34-6f465d4fb60a","Type":"ContainerStarted","Data":"ff1baf324057f47c0b819c250318a849ee597a72ee442cdb8ab3c2421a5e74e1"} Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.626360 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"18f7a43a-34f6-4567-9e34-6f465d4fb60a","Type":"ContainerStarted","Data":"00a812adae3b31eaf83b546d4f22f9a2651172a51e5c9c6be611814a402c5b0e"} Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.641329 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.808434 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.808772 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-run\") pod \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.808815 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.808839 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-sys\") pod \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.808904 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92b1a68c-05fc-4027-9ac8-ebf5e0355134-logs\") pod \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.808927 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-etc-iscsi\") pod 
\"92b1a68c-05fc-4027-9ac8-ebf5e0355134\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.808950 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92b1a68c-05fc-4027-9ac8-ebf5e0355134-scripts\") pod \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.809067 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-var-locks-brick\") pod \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.809097 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-lib-modules\") pod \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.809135 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b1a68c-05fc-4027-9ac8-ebf5e0355134-config-data\") pod \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.809158 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-run" (OuterVolumeSpecName: "run") pod "92b1a68c-05fc-4027-9ac8-ebf5e0355134" (UID: "92b1a68c-05fc-4027-9ac8-ebf5e0355134"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.809175 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "92b1a68c-05fc-4027-9ac8-ebf5e0355134" (UID: "92b1a68c-05fc-4027-9ac8-ebf5e0355134"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.809204 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdbr5\" (UniqueName: \"kubernetes.io/projected/92b1a68c-05fc-4027-9ac8-ebf5e0355134-kube-api-access-pdbr5\") pod \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.809217 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "92b1a68c-05fc-4027-9ac8-ebf5e0355134" (UID: "92b1a68c-05fc-4027-9ac8-ebf5e0355134"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.809264 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/92b1a68c-05fc-4027-9ac8-ebf5e0355134-httpd-run\") pod \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.809293 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-dev\") pod \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.809335 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-etc-nvme\") pod \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\" (UID: \"92b1a68c-05fc-4027-9ac8-ebf5e0355134\") " Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.809544 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-sys" (OuterVolumeSpecName: "sys") pod "92b1a68c-05fc-4027-9ac8-ebf5e0355134" (UID: "92b1a68c-05fc-4027-9ac8-ebf5e0355134"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.809614 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "92b1a68c-05fc-4027-9ac8-ebf5e0355134" (UID: "92b1a68c-05fc-4027-9ac8-ebf5e0355134"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.809649 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-dev" (OuterVolumeSpecName: "dev") pod "92b1a68c-05fc-4027-9ac8-ebf5e0355134" (UID: "92b1a68c-05fc-4027-9ac8-ebf5e0355134"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.809670 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "92b1a68c-05fc-4027-9ac8-ebf5e0355134" (UID: "92b1a68c-05fc-4027-9ac8-ebf5e0355134"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.809665 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92b1a68c-05fc-4027-9ac8-ebf5e0355134-logs" (OuterVolumeSpecName: "logs") pod "92b1a68c-05fc-4027-9ac8-ebf5e0355134" (UID: "92b1a68c-05fc-4027-9ac8-ebf5e0355134"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.809902 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92b1a68c-05fc-4027-9ac8-ebf5e0355134-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "92b1a68c-05fc-4027-9ac8-ebf5e0355134" (UID: "92b1a68c-05fc-4027-9ac8-ebf5e0355134"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.809932 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-dev\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.810019 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.810054 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.810079 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-sys\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.810101 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92b1a68c-05fc-4027-9ac8-ebf5e0355134-logs\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.810126 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.810149 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.810173 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/92b1a68c-05fc-4027-9ac8-ebf5e0355134-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.812811 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance-cache") pod "92b1a68c-05fc-4027-9ac8-ebf5e0355134" (UID: "92b1a68c-05fc-4027-9ac8-ebf5e0355134"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.813559 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92b1a68c-05fc-4027-9ac8-ebf5e0355134-config-data" (OuterVolumeSpecName: "config-data") pod "92b1a68c-05fc-4027-9ac8-ebf5e0355134" (UID: "92b1a68c-05fc-4027-9ac8-ebf5e0355134"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.813700 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "glance") pod "92b1a68c-05fc-4027-9ac8-ebf5e0355134" (UID: "92b1a68c-05fc-4027-9ac8-ebf5e0355134"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.814066 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92b1a68c-05fc-4027-9ac8-ebf5e0355134-kube-api-access-pdbr5" (OuterVolumeSpecName: "kube-api-access-pdbr5") pod "92b1a68c-05fc-4027-9ac8-ebf5e0355134" (UID: "92b1a68c-05fc-4027-9ac8-ebf5e0355134"). InnerVolumeSpecName "kube-api-access-pdbr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.815498 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92b1a68c-05fc-4027-9ac8-ebf5e0355134-scripts" (OuterVolumeSpecName: "scripts") pod "92b1a68c-05fc-4027-9ac8-ebf5e0355134" (UID: "92b1a68c-05fc-4027-9ac8-ebf5e0355134"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.912103 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/92b1a68c-05fc-4027-9ac8-ebf5e0355134-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.912442 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.912469 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.912489 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92b1a68c-05fc-4027-9ac8-ebf5e0355134-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.912504 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b1a68c-05fc-4027-9ac8-ebf5e0355134-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.912521 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdbr5\" (UniqueName: \"kubernetes.io/projected/92b1a68c-05fc-4027-9ac8-ebf5e0355134-kube-api-access-pdbr5\") on 
node \"crc\" DevicePath \"\"" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.931250 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Nov 26 22:57:10 crc kubenswrapper[5008]: I1126 22:57:10.932044 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.013137 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.013364 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.639614 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.642336 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"18f7a43a-34f6-4567-9e34-6f465d4fb60a","Type":"ContainerStarted","Data":"e6b71e8ebce349a5afb3de65fd2b64ca832d2cbe949aee337667fa3e8de8bb82"} Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.694099 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=3.6940817900000003 podStartE2EDuration="3.69408179s" podCreationTimestamp="2025-11-26 22:57:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:57:11.688711551 +0000 UTC m=+1107.101405593" watchObservedRunningTime="2025-11-26 22:57:11.69408179 +0000 UTC m=+1107.106775802" Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.746195 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.751770 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.799752 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.801180 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.815205 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.923981 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cz9q\" (UniqueName: \"kubernetes.io/projected/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-kube-api-access-2cz9q\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.924019 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-sys\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.924045 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-config-data\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.924067 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.924081 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-scripts\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.924113 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-dev\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.924132 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-run\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.924157 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-logs\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.924173 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.924196 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-httpd-run\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.924224 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.924245 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.924272 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-lib-modules\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:11 crc kubenswrapper[5008]: I1126 22:57:11.924288 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-etc-nvme\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.025799 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.026126 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") device mount path \"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.032805 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-scripts\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.026474 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-scripts\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.033221 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-dev\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.033258 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-run\") pod \"glance-default-single-1\" (UID: 
\"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.033320 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-logs\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.033353 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.033406 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-httpd-run\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.033438 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.033441 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-run\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 
22:57:12.033476 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.033506 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-dev\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.033530 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-lib-modules\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.033564 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-etc-nvme\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.033625 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") device mount path \"/mnt/openstack/pv20\"" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.033632 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-2cz9q\" (UniqueName: \"kubernetes.io/projected/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-kube-api-access-2cz9q\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.033664 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-sys\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.033691 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-config-data\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.034144 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.034453 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-httpd-run\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.034458 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-etc-nvme\") pod 
\"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.034483 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-sys\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.034537 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.034857 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-logs\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.035461 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-lib-modules\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.040040 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-config-data\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc 
kubenswrapper[5008]: I1126 22:57:12.061188 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.066238 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.080445 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cz9q\" (UniqueName: \"kubernetes.io/projected/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-kube-api-access-2cz9q\") pod \"glance-default-single-1\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.128841 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.375935 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.657136 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"f769aa7e-3bd6-4c79-aa02-8d2df1da4133","Type":"ContainerStarted","Data":"5d66dae61076a74e1b092908db303873156ed2e6ca6e2a1e86d3b0933a7ccc76"} Nov 26 22:57:12 crc kubenswrapper[5008]: I1126 22:57:12.657576 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"f769aa7e-3bd6-4c79-aa02-8d2df1da4133","Type":"ContainerStarted","Data":"2a9ac46f686308e45bb2fa2ad4add4f5d17bd0358f466b8bf0783f2138a38e8f"} Nov 26 22:57:13 crc kubenswrapper[5008]: I1126 22:57:13.534611 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92b1a68c-05fc-4027-9ac8-ebf5e0355134" path="/var/lib/kubelet/pods/92b1a68c-05fc-4027-9ac8-ebf5e0355134/volumes" Nov 26 22:57:13 crc kubenswrapper[5008]: I1126 22:57:13.667325 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"f769aa7e-3bd6-4c79-aa02-8d2df1da4133","Type":"ContainerStarted","Data":"77ce72c19ff94751d7fca414fa1e538a590c3d6b6000b1f3c480efa69975a4bb"} Nov 26 22:57:13 crc kubenswrapper[5008]: I1126 22:57:13.694386 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=2.694372199 podStartE2EDuration="2.694372199s" podCreationTimestamp="2025-11-26 22:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:57:13.692053576 +0000 UTC m=+1109.104747588" watchObservedRunningTime="2025-11-26 22:57:13.694372199 +0000 
UTC m=+1109.107066201" Nov 26 22:57:20 crc kubenswrapper[5008]: I1126 22:57:20.181444 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:20 crc kubenswrapper[5008]: I1126 22:57:20.183596 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:20 crc kubenswrapper[5008]: I1126 22:57:20.228888 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:20 crc kubenswrapper[5008]: I1126 22:57:20.246255 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:20 crc kubenswrapper[5008]: I1126 22:57:20.737662 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:20 crc kubenswrapper[5008]: I1126 22:57:20.737751 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:22 crc kubenswrapper[5008]: I1126 22:57:22.129595 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:22 crc kubenswrapper[5008]: I1126 22:57:22.131005 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:22 crc kubenswrapper[5008]: I1126 22:57:22.176775 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:22 crc kubenswrapper[5008]: I1126 22:57:22.189461 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:22 crc kubenswrapper[5008]: I1126 22:57:22.665486 5008 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:22 crc kubenswrapper[5008]: I1126 22:57:22.688713 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:22 crc kubenswrapper[5008]: I1126 22:57:22.757226 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:22 crc kubenswrapper[5008]: I1126 22:57:22.757767 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:24 crc kubenswrapper[5008]: I1126 22:57:24.717007 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:24 crc kubenswrapper[5008]: I1126 22:57:24.735525 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:57:24 crc kubenswrapper[5008]: I1126 22:57:24.807852 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 22:57:24 crc kubenswrapper[5008]: I1126 22:57:24.808393 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="18f7a43a-34f6-4567-9e34-6f465d4fb60a" containerName="glance-log" containerID="cri-o://ff1baf324057f47c0b819c250318a849ee597a72ee442cdb8ab3c2421a5e74e1" gracePeriod=30 Nov 26 22:57:24 crc kubenswrapper[5008]: I1126 22:57:24.808572 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="18f7a43a-34f6-4567-9e34-6f465d4fb60a" containerName="glance-httpd" containerID="cri-o://e6b71e8ebce349a5afb3de65fd2b64ca832d2cbe949aee337667fa3e8de8bb82" gracePeriod=30 Nov 26 22:57:25 crc kubenswrapper[5008]: I1126 22:57:25.789678 5008 generic.go:334] "Generic (PLEG): 
container finished" podID="18f7a43a-34f6-4567-9e34-6f465d4fb60a" containerID="ff1baf324057f47c0b819c250318a849ee597a72ee442cdb8ab3c2421a5e74e1" exitCode=143 Nov 26 22:57:25 crc kubenswrapper[5008]: I1126 22:57:25.789987 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"18f7a43a-34f6-4567-9e34-6f465d4fb60a","Type":"ContainerDied","Data":"ff1baf324057f47c0b819c250318a849ee597a72ee442cdb8ab3c2421a5e74e1"} Nov 26 22:57:28 crc kubenswrapper[5008]: I1126 22:57:28.819794 5008 generic.go:334] "Generic (PLEG): container finished" podID="18f7a43a-34f6-4567-9e34-6f465d4fb60a" containerID="e6b71e8ebce349a5afb3de65fd2b64ca832d2cbe949aee337667fa3e8de8bb82" exitCode=0 Nov 26 22:57:28 crc kubenswrapper[5008]: I1126 22:57:28.819892 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"18f7a43a-34f6-4567-9e34-6f465d4fb60a","Type":"ContainerDied","Data":"e6b71e8ebce349a5afb3de65fd2b64ca832d2cbe949aee337667fa3e8de8bb82"} Nov 26 22:57:29 crc kubenswrapper[5008]: I1126 22:57:29.281495 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 22:57:29 crc kubenswrapper[5008]: I1126 22:57:29.281590 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.129192 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.143783 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdmpp\" (UniqueName: \"kubernetes.io/projected/18f7a43a-34f6-4567-9e34-6f465d4fb60a-kube-api-access-bdmpp\") pod \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.144048 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-etc-nvme\") pod \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.144131 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f7a43a-34f6-4567-9e34-6f465d4fb60a-scripts\") pod \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.144209 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "18f7a43a-34f6-4567-9e34-6f465d4fb60a" (UID: "18f7a43a-34f6-4567-9e34-6f465d4fb60a"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.144595 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.149673 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f7a43a-34f6-4567-9e34-6f465d4fb60a-scripts" (OuterVolumeSpecName: "scripts") pod "18f7a43a-34f6-4567-9e34-6f465d4fb60a" (UID: "18f7a43a-34f6-4567-9e34-6f465d4fb60a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.150425 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18f7a43a-34f6-4567-9e34-6f465d4fb60a-kube-api-access-bdmpp" (OuterVolumeSpecName: "kube-api-access-bdmpp") pod "18f7a43a-34f6-4567-9e34-6f465d4fb60a" (UID: "18f7a43a-34f6-4567-9e34-6f465d4fb60a"). InnerVolumeSpecName "kube-api-access-bdmpp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.246126 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.246212 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.246255 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-sys\") pod \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.246284 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18f7a43a-34f6-4567-9e34-6f465d4fb60a-httpd-run\") pod \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.246329 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18f7a43a-34f6-4567-9e34-6f465d4fb60a-logs\") pod \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.246401 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-dev\") pod \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\" (UID: 
\"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.246399 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-sys" (OuterVolumeSpecName: "sys") pod "18f7a43a-34f6-4567-9e34-6f465d4fb60a" (UID: "18f7a43a-34f6-4567-9e34-6f465d4fb60a"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.246438 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-run\") pod \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.246457 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-etc-iscsi\") pod \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.246479 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-lib-modules\") pod \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.246514 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-var-locks-brick\") pod \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.246600 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/18f7a43a-34f6-4567-9e34-6f465d4fb60a-config-data\") pod \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\" (UID: \"18f7a43a-34f6-4567-9e34-6f465d4fb60a\") " Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.246916 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f7a43a-34f6-4567-9e34-6f465d4fb60a-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.246932 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdmpp\" (UniqueName: \"kubernetes.io/projected/18f7a43a-34f6-4567-9e34-6f465d4fb60a-kube-api-access-bdmpp\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.246945 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-sys\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.246956 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18f7a43a-34f6-4567-9e34-6f465d4fb60a-logs" (OuterVolumeSpecName: "logs") pod "18f7a43a-34f6-4567-9e34-6f465d4fb60a" (UID: "18f7a43a-34f6-4567-9e34-6f465d4fb60a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.247065 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18f7a43a-34f6-4567-9e34-6f465d4fb60a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "18f7a43a-34f6-4567-9e34-6f465d4fb60a" (UID: "18f7a43a-34f6-4567-9e34-6f465d4fb60a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.247115 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-dev" (OuterVolumeSpecName: "dev") pod "18f7a43a-34f6-4567-9e34-6f465d4fb60a" (UID: "18f7a43a-34f6-4567-9e34-6f465d4fb60a"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.247085 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "18f7a43a-34f6-4567-9e34-6f465d4fb60a" (UID: "18f7a43a-34f6-4567-9e34-6f465d4fb60a"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.247150 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-run" (OuterVolumeSpecName: "run") pod "18f7a43a-34f6-4567-9e34-6f465d4fb60a" (UID: "18f7a43a-34f6-4567-9e34-6f465d4fb60a"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.247188 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "18f7a43a-34f6-4567-9e34-6f465d4fb60a" (UID: "18f7a43a-34f6-4567-9e34-6f465d4fb60a"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.247186 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "18f7a43a-34f6-4567-9e34-6f465d4fb60a" (UID: "18f7a43a-34f6-4567-9e34-6f465d4fb60a"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.249997 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance-cache") pod "18f7a43a-34f6-4567-9e34-6f465d4fb60a" (UID: "18f7a43a-34f6-4567-9e34-6f465d4fb60a"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.251172 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "18f7a43a-34f6-4567-9e34-6f465d4fb60a" (UID: "18f7a43a-34f6-4567-9e34-6f465d4fb60a"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.290108 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f7a43a-34f6-4567-9e34-6f465d4fb60a-config-data" (OuterVolumeSpecName: "config-data") pod "18f7a43a-34f6-4567-9e34-6f465d4fb60a" (UID: "18f7a43a-34f6-4567-9e34-6f465d4fb60a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.349011 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.349072 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.349100 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18f7a43a-34f6-4567-9e34-6f465d4fb60a-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.349121 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18f7a43a-34f6-4567-9e34-6f465d4fb60a-logs\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.349145 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-dev\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.349166 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.349186 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.349209 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.349231 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/18f7a43a-34f6-4567-9e34-6f465d4fb60a-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.349256 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f7a43a-34f6-4567-9e34-6f465d4fb60a-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.373089 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.376134 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.450740 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.450823 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.838689 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"18f7a43a-34f6-4567-9e34-6f465d4fb60a","Type":"ContainerDied","Data":"00a812adae3b31eaf83b546d4f22f9a2651172a51e5c9c6be611814a402c5b0e"} Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.838745 5008 scope.go:117] 
"RemoveContainer" containerID="e6b71e8ebce349a5afb3de65fd2b64ca832d2cbe949aee337667fa3e8de8bb82" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.838813 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.873163 5008 scope.go:117] "RemoveContainer" containerID="ff1baf324057f47c0b819c250318a849ee597a72ee442cdb8ab3c2421a5e74e1" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.892448 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.899952 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.927781 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 22:57:30 crc kubenswrapper[5008]: E1126 22:57:30.928170 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18f7a43a-34f6-4567-9e34-6f465d4fb60a" containerName="glance-log" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.928198 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="18f7a43a-34f6-4567-9e34-6f465d4fb60a" containerName="glance-log" Nov 26 22:57:30 crc kubenswrapper[5008]: E1126 22:57:30.928224 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18f7a43a-34f6-4567-9e34-6f465d4fb60a" containerName="glance-httpd" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.928235 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="18f7a43a-34f6-4567-9e34-6f465d4fb60a" containerName="glance-httpd" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.928435 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="18f7a43a-34f6-4567-9e34-6f465d4fb60a" containerName="glance-log" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 
22:57:30.928461 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="18f7a43a-34f6-4567-9e34-6f465d4fb60a" containerName="glance-httpd" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.929328 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.944773 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 22:57:30 crc kubenswrapper[5008]: I1126 22:57:30.957539 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.058825 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.058877 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-dev\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.058899 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-lib-modules\") pod \"glance-default-single-0\" (UID: 
\"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.058928 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkk44\" (UniqueName: \"kubernetes.io/projected/052726a0-d05a-4635-b963-5839f5554297-kube-api-access-vkk44\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.059526 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-run\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.059803 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-sys\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.059935 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/052726a0-d05a-4635-b963-5839f5554297-logs\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.059992 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/052726a0-d05a-4635-b963-5839f5554297-scripts\") pod \"glance-default-single-0\" (UID: 
\"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.060082 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-etc-nvme\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.060125 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.060386 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/052726a0-d05a-4635-b963-5839f5554297-config-data\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.060439 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.060470 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/052726a0-d05a-4635-b963-5839f5554297-httpd-run\") pod \"glance-default-single-0\" (UID: 
\"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.060570 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.060791 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.109444 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.162845 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-dev\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.162894 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-lib-modules\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc 
kubenswrapper[5008]: I1126 22:57:31.162947 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkk44\" (UniqueName: \"kubernetes.io/projected/052726a0-d05a-4635-b963-5839f5554297-kube-api-access-vkk44\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.162993 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-dev\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.163020 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-lib-modules\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.163099 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-run\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.163131 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-sys\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.163177 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-run\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.163195 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/052726a0-d05a-4635-b963-5839f5554297-logs\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.163209 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-sys\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.163240 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/052726a0-d05a-4635-b963-5839f5554297-scripts\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.163269 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-etc-nvme\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.163499 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-etc-nvme\") pod \"glance-default-single-0\" (UID: 
\"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.163591 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/052726a0-d05a-4635-b963-5839f5554297-logs\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.163910 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.164049 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.164070 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/052726a0-d05a-4635-b963-5839f5554297-config-data\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.164110 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 
22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.164129 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/052726a0-d05a-4635-b963-5839f5554297-httpd-run\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.164221 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.164298 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.164498 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/052726a0-d05a-4635-b963-5839f5554297-httpd-run\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.164545 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.167576 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/052726a0-d05a-4635-b963-5839f5554297-config-data\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.172394 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/052726a0-d05a-4635-b963-5839f5554297-scripts\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.184944 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkk44\" (UniqueName: \"kubernetes.io/projected/052726a0-d05a-4635-b963-5839f5554297-kube-api-access-vkk44\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.196441 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-0\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.264097 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.535718 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18f7a43a-34f6-4567-9e34-6f465d4fb60a" path="/var/lib/kubelet/pods/18f7a43a-34f6-4567-9e34-6f465d4fb60a/volumes" Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.549511 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.860791 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"052726a0-d05a-4635-b963-5839f5554297","Type":"ContainerStarted","Data":"8603f85031ba40914d07f90db4065fa3f2623650229e114e876af3561d209a75"} Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.861453 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"052726a0-d05a-4635-b963-5839f5554297","Type":"ContainerStarted","Data":"5f09907fa9825b233b6a6c79f73d1ad5eebc4391dd85be5b797207577fbb4633"} Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.861604 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"052726a0-d05a-4635-b963-5839f5554297","Type":"ContainerStarted","Data":"caf7a16d5d2d52f85509a332421d74035a0301fb0fc3bd63103e717df7ab4934"} Nov 26 22:57:31 crc kubenswrapper[5008]: I1126 22:57:31.887720 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=1.887695322 podStartE2EDuration="1.887695322s" podCreationTimestamp="2025-11-26 22:57:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:57:31.883139438 +0000 UTC m=+1127.295833480" watchObservedRunningTime="2025-11-26 22:57:31.887695322 +0000 
UTC m=+1127.300389354" Nov 26 22:57:41 crc kubenswrapper[5008]: I1126 22:57:41.265211 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:41 crc kubenswrapper[5008]: I1126 22:57:41.265867 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:41 crc kubenswrapper[5008]: I1126 22:57:41.308567 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:41 crc kubenswrapper[5008]: I1126 22:57:41.350805 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:41 crc kubenswrapper[5008]: I1126 22:57:41.990878 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:41 crc kubenswrapper[5008]: I1126 22:57:41.992369 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:43 crc kubenswrapper[5008]: I1126 22:57:43.817162 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:43 crc kubenswrapper[5008]: I1126 22:57:43.828039 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:57:54 crc kubenswrapper[5008]: I1126 22:57:54.674645 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-dvtcp"] Nov 26 22:57:54 crc kubenswrapper[5008]: I1126 22:57:54.686412 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-dvtcp"] Nov 26 22:57:54 crc kubenswrapper[5008]: I1126 22:57:54.789772 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-2nxfj"] 
Nov 26 22:57:54 crc kubenswrapper[5008]: I1126 22:57:54.790734 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-2nxfj" Nov 26 22:57:54 crc kubenswrapper[5008]: I1126 22:57:54.793793 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Nov 26 22:57:54 crc kubenswrapper[5008]: I1126 22:57:54.793812 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Nov 26 22:57:54 crc kubenswrapper[5008]: I1126 22:57:54.801596 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-2nxfj"] Nov 26 22:57:54 crc kubenswrapper[5008]: I1126 22:57:54.880664 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1697aef7-c233-444e-bc97-5a0ac3cd5817-config-data\") pod \"glance-db-sync-2nxfj\" (UID: \"1697aef7-c233-444e-bc97-5a0ac3cd5817\") " pod="glance-kuttl-tests/glance-db-sync-2nxfj" Nov 26 22:57:54 crc kubenswrapper[5008]: I1126 22:57:54.881088 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1697aef7-c233-444e-bc97-5a0ac3cd5817-combined-ca-bundle\") pod \"glance-db-sync-2nxfj\" (UID: \"1697aef7-c233-444e-bc97-5a0ac3cd5817\") " pod="glance-kuttl-tests/glance-db-sync-2nxfj" Nov 26 22:57:54 crc kubenswrapper[5008]: I1126 22:57:54.881309 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf9nd\" (UniqueName: \"kubernetes.io/projected/1697aef7-c233-444e-bc97-5a0ac3cd5817-kube-api-access-vf9nd\") pod \"glance-db-sync-2nxfj\" (UID: \"1697aef7-c233-444e-bc97-5a0ac3cd5817\") " pod="glance-kuttl-tests/glance-db-sync-2nxfj" Nov 26 22:57:54 crc kubenswrapper[5008]: I1126 22:57:54.881354 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1697aef7-c233-444e-bc97-5a0ac3cd5817-db-sync-config-data\") pod \"glance-db-sync-2nxfj\" (UID: \"1697aef7-c233-444e-bc97-5a0ac3cd5817\") " pod="glance-kuttl-tests/glance-db-sync-2nxfj" Nov 26 22:57:54 crc kubenswrapper[5008]: I1126 22:57:54.982691 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf9nd\" (UniqueName: \"kubernetes.io/projected/1697aef7-c233-444e-bc97-5a0ac3cd5817-kube-api-access-vf9nd\") pod \"glance-db-sync-2nxfj\" (UID: \"1697aef7-c233-444e-bc97-5a0ac3cd5817\") " pod="glance-kuttl-tests/glance-db-sync-2nxfj" Nov 26 22:57:54 crc kubenswrapper[5008]: I1126 22:57:54.982730 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1697aef7-c233-444e-bc97-5a0ac3cd5817-db-sync-config-data\") pod \"glance-db-sync-2nxfj\" (UID: \"1697aef7-c233-444e-bc97-5a0ac3cd5817\") " pod="glance-kuttl-tests/glance-db-sync-2nxfj" Nov 26 22:57:54 crc kubenswrapper[5008]: I1126 22:57:54.982789 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1697aef7-c233-444e-bc97-5a0ac3cd5817-config-data\") pod \"glance-db-sync-2nxfj\" (UID: \"1697aef7-c233-444e-bc97-5a0ac3cd5817\") " pod="glance-kuttl-tests/glance-db-sync-2nxfj" Nov 26 22:57:54 crc kubenswrapper[5008]: I1126 22:57:54.982847 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1697aef7-c233-444e-bc97-5a0ac3cd5817-combined-ca-bundle\") pod \"glance-db-sync-2nxfj\" (UID: \"1697aef7-c233-444e-bc97-5a0ac3cd5817\") " pod="glance-kuttl-tests/glance-db-sync-2nxfj" Nov 26 22:57:54 crc kubenswrapper[5008]: I1126 22:57:54.989173 5008 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1697aef7-c233-444e-bc97-5a0ac3cd5817-config-data\") pod \"glance-db-sync-2nxfj\" (UID: \"1697aef7-c233-444e-bc97-5a0ac3cd5817\") " pod="glance-kuttl-tests/glance-db-sync-2nxfj" Nov 26 22:57:54 crc kubenswrapper[5008]: I1126 22:57:54.989258 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1697aef7-c233-444e-bc97-5a0ac3cd5817-db-sync-config-data\") pod \"glance-db-sync-2nxfj\" (UID: \"1697aef7-c233-444e-bc97-5a0ac3cd5817\") " pod="glance-kuttl-tests/glance-db-sync-2nxfj" Nov 26 22:57:54 crc kubenswrapper[5008]: I1126 22:57:54.989889 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1697aef7-c233-444e-bc97-5a0ac3cd5817-combined-ca-bundle\") pod \"glance-db-sync-2nxfj\" (UID: \"1697aef7-c233-444e-bc97-5a0ac3cd5817\") " pod="glance-kuttl-tests/glance-db-sync-2nxfj" Nov 26 22:57:55 crc kubenswrapper[5008]: I1126 22:57:55.009722 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf9nd\" (UniqueName: \"kubernetes.io/projected/1697aef7-c233-444e-bc97-5a0ac3cd5817-kube-api-access-vf9nd\") pod \"glance-db-sync-2nxfj\" (UID: \"1697aef7-c233-444e-bc97-5a0ac3cd5817\") " pod="glance-kuttl-tests/glance-db-sync-2nxfj" Nov 26 22:57:55 crc kubenswrapper[5008]: I1126 22:57:55.130248 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-2nxfj" Nov 26 22:57:55 crc kubenswrapper[5008]: I1126 22:57:55.438104 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-2nxfj"] Nov 26 22:57:55 crc kubenswrapper[5008]: I1126 22:57:55.538727 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e862e5b1-269c-4e52-8a5c-360d79ba776e" path="/var/lib/kubelet/pods/e862e5b1-269c-4e52-8a5c-360d79ba776e/volumes" Nov 26 22:57:56 crc kubenswrapper[5008]: I1126 22:57:56.125067 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-2nxfj" event={"ID":"1697aef7-c233-444e-bc97-5a0ac3cd5817","Type":"ContainerStarted","Data":"4cedf9a75953e38f7129d2ca3aa32f29355e72364fe71030a003ff23737c8645"} Nov 26 22:57:57 crc kubenswrapper[5008]: I1126 22:57:57.137919 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-2nxfj" event={"ID":"1697aef7-c233-444e-bc97-5a0ac3cd5817","Type":"ContainerStarted","Data":"8ee068d81de26d19fe2d88e0d3208123230600f6994cd80f1dd355edf03cf6cb"} Nov 26 22:57:57 crc kubenswrapper[5008]: I1126 22:57:57.161222 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-2nxfj" podStartSLOduration=3.161205424 podStartE2EDuration="3.161205424s" podCreationTimestamp="2025-11-26 22:57:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:57:57.157301871 +0000 UTC m=+1152.569995903" watchObservedRunningTime="2025-11-26 22:57:57.161205424 +0000 UTC m=+1152.573899426" Nov 26 22:57:59 crc kubenswrapper[5008]: I1126 22:57:59.161125 5008 generic.go:334] "Generic (PLEG): container finished" podID="1697aef7-c233-444e-bc97-5a0ac3cd5817" containerID="8ee068d81de26d19fe2d88e0d3208123230600f6994cd80f1dd355edf03cf6cb" exitCode=0 Nov 26 22:57:59 crc kubenswrapper[5008]: I1126 
22:57:59.161168 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-2nxfj" event={"ID":"1697aef7-c233-444e-bc97-5a0ac3cd5817","Type":"ContainerDied","Data":"8ee068d81de26d19fe2d88e0d3208123230600f6994cd80f1dd355edf03cf6cb"} Nov 26 22:57:59 crc kubenswrapper[5008]: I1126 22:57:59.281002 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 22:57:59 crc kubenswrapper[5008]: I1126 22:57:59.281090 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 22:57:59 crc kubenswrapper[5008]: I1126 22:57:59.281156 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 26 22:57:59 crc kubenswrapper[5008]: I1126 22:57:59.282073 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9fe41703f50fb88cb25a2da55af0500e1caf0de96a6d4c9a6682af82c31219e6"} pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 22:57:59 crc kubenswrapper[5008]: I1126 22:57:59.282181 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" 
containerID="cri-o://9fe41703f50fb88cb25a2da55af0500e1caf0de96a6d4c9a6682af82c31219e6" gracePeriod=600 Nov 26 22:58:00 crc kubenswrapper[5008]: I1126 22:58:00.174308 5008 generic.go:334] "Generic (PLEG): container finished" podID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerID="9fe41703f50fb88cb25a2da55af0500e1caf0de96a6d4c9a6682af82c31219e6" exitCode=0 Nov 26 22:58:00 crc kubenswrapper[5008]: I1126 22:58:00.174380 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" event={"ID":"8e558d58-c5ad-41f5-930f-36ac26b1a1ea","Type":"ContainerDied","Data":"9fe41703f50fb88cb25a2da55af0500e1caf0de96a6d4c9a6682af82c31219e6"} Nov 26 22:58:00 crc kubenswrapper[5008]: I1126 22:58:00.174985 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" event={"ID":"8e558d58-c5ad-41f5-930f-36ac26b1a1ea","Type":"ContainerStarted","Data":"9fefa5cb673a1b6294b5289afba9c13ed72e7092fa22ba3e5f1ef5b55c16e305"} Nov 26 22:58:00 crc kubenswrapper[5008]: I1126 22:58:00.175009 5008 scope.go:117] "RemoveContainer" containerID="7996ba3586721dd9a32b1864cf2a2c0cd8e592ad6992d5bba7e5d7e532bd504c" Nov 26 22:58:00 crc kubenswrapper[5008]: I1126 22:58:00.485052 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-2nxfj" Nov 26 22:58:00 crc kubenswrapper[5008]: I1126 22:58:00.567568 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1697aef7-c233-444e-bc97-5a0ac3cd5817-config-data\") pod \"1697aef7-c233-444e-bc97-5a0ac3cd5817\" (UID: \"1697aef7-c233-444e-bc97-5a0ac3cd5817\") " Nov 26 22:58:00 crc kubenswrapper[5008]: I1126 22:58:00.567641 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1697aef7-c233-444e-bc97-5a0ac3cd5817-db-sync-config-data\") pod \"1697aef7-c233-444e-bc97-5a0ac3cd5817\" (UID: \"1697aef7-c233-444e-bc97-5a0ac3cd5817\") " Nov 26 22:58:00 crc kubenswrapper[5008]: I1126 22:58:00.567690 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1697aef7-c233-444e-bc97-5a0ac3cd5817-combined-ca-bundle\") pod \"1697aef7-c233-444e-bc97-5a0ac3cd5817\" (UID: \"1697aef7-c233-444e-bc97-5a0ac3cd5817\") " Nov 26 22:58:00 crc kubenswrapper[5008]: I1126 22:58:00.567735 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf9nd\" (UniqueName: \"kubernetes.io/projected/1697aef7-c233-444e-bc97-5a0ac3cd5817-kube-api-access-vf9nd\") pod \"1697aef7-c233-444e-bc97-5a0ac3cd5817\" (UID: \"1697aef7-c233-444e-bc97-5a0ac3cd5817\") " Nov 26 22:58:00 crc kubenswrapper[5008]: I1126 22:58:00.574921 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1697aef7-c233-444e-bc97-5a0ac3cd5817-kube-api-access-vf9nd" (OuterVolumeSpecName: "kube-api-access-vf9nd") pod "1697aef7-c233-444e-bc97-5a0ac3cd5817" (UID: "1697aef7-c233-444e-bc97-5a0ac3cd5817"). InnerVolumeSpecName "kube-api-access-vf9nd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:58:00 crc kubenswrapper[5008]: I1126 22:58:00.575701 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1697aef7-c233-444e-bc97-5a0ac3cd5817-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1697aef7-c233-444e-bc97-5a0ac3cd5817" (UID: "1697aef7-c233-444e-bc97-5a0ac3cd5817"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:58:00 crc kubenswrapper[5008]: I1126 22:58:00.590387 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1697aef7-c233-444e-bc97-5a0ac3cd5817-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1697aef7-c233-444e-bc97-5a0ac3cd5817" (UID: "1697aef7-c233-444e-bc97-5a0ac3cd5817"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:58:00 crc kubenswrapper[5008]: I1126 22:58:00.607482 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1697aef7-c233-444e-bc97-5a0ac3cd5817-config-data" (OuterVolumeSpecName: "config-data") pod "1697aef7-c233-444e-bc97-5a0ac3cd5817" (UID: "1697aef7-c233-444e-bc97-5a0ac3cd5817"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:58:00 crc kubenswrapper[5008]: I1126 22:58:00.669440 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1697aef7-c233-444e-bc97-5a0ac3cd5817-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:00 crc kubenswrapper[5008]: I1126 22:58:00.669486 5008 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1697aef7-c233-444e-bc97-5a0ac3cd5817-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:00 crc kubenswrapper[5008]: I1126 22:58:00.669501 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1697aef7-c233-444e-bc97-5a0ac3cd5817-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:00 crc kubenswrapper[5008]: I1126 22:58:00.669512 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf9nd\" (UniqueName: \"kubernetes.io/projected/1697aef7-c233-444e-bc97-5a0ac3cd5817-kube-api-access-vf9nd\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:01 crc kubenswrapper[5008]: I1126 22:58:01.190297 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-2nxfj" event={"ID":"1697aef7-c233-444e-bc97-5a0ac3cd5817","Type":"ContainerDied","Data":"4cedf9a75953e38f7129d2ca3aa32f29355e72364fe71030a003ff23737c8645"} Nov 26 22:58:01 crc kubenswrapper[5008]: I1126 22:58:01.190355 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-2nxfj" Nov 26 22:58:01 crc kubenswrapper[5008]: I1126 22:58:01.190368 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cedf9a75953e38f7129d2ca3aa32f29355e72364fe71030a003ff23737c8645" Nov 26 22:58:01 crc kubenswrapper[5008]: I1126 22:58:01.385557 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 26 22:58:01 crc kubenswrapper[5008]: I1126 22:58:01.386391 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="f769aa7e-3bd6-4c79-aa02-8d2df1da4133" containerName="glance-log" containerID="cri-o://5d66dae61076a74e1b092908db303873156ed2e6ca6e2a1e86d3b0933a7ccc76" gracePeriod=30 Nov 26 22:58:01 crc kubenswrapper[5008]: I1126 22:58:01.386521 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="f769aa7e-3bd6-4c79-aa02-8d2df1da4133" containerName="glance-httpd" containerID="cri-o://77ce72c19ff94751d7fca414fa1e538a590c3d6b6000b1f3c480efa69975a4bb" gracePeriod=30 Nov 26 22:58:01 crc kubenswrapper[5008]: I1126 22:58:01.399351 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 22:58:01 crc kubenswrapper[5008]: I1126 22:58:01.399632 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="052726a0-d05a-4635-b963-5839f5554297" containerName="glance-log" containerID="cri-o://5f09907fa9825b233b6a6c79f73d1ad5eebc4391dd85be5b797207577fbb4633" gracePeriod=30 Nov 26 22:58:01 crc kubenswrapper[5008]: I1126 22:58:01.399766 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="052726a0-d05a-4635-b963-5839f5554297" containerName="glance-httpd" 
containerID="cri-o://8603f85031ba40914d07f90db4065fa3f2623650229e114e876af3561d209a75" gracePeriod=30 Nov 26 22:58:02 crc kubenswrapper[5008]: I1126 22:58:02.199834 5008 generic.go:334] "Generic (PLEG): container finished" podID="f769aa7e-3bd6-4c79-aa02-8d2df1da4133" containerID="5d66dae61076a74e1b092908db303873156ed2e6ca6e2a1e86d3b0933a7ccc76" exitCode=143 Nov 26 22:58:02 crc kubenswrapper[5008]: I1126 22:58:02.199935 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"f769aa7e-3bd6-4c79-aa02-8d2df1da4133","Type":"ContainerDied","Data":"5d66dae61076a74e1b092908db303873156ed2e6ca6e2a1e86d3b0933a7ccc76"} Nov 26 22:58:02 crc kubenswrapper[5008]: I1126 22:58:02.202920 5008 generic.go:334] "Generic (PLEG): container finished" podID="052726a0-d05a-4635-b963-5839f5554297" containerID="5f09907fa9825b233b6a6c79f73d1ad5eebc4391dd85be5b797207577fbb4633" exitCode=143 Nov 26 22:58:02 crc kubenswrapper[5008]: I1126 22:58:02.202983 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"052726a0-d05a-4635-b963-5839f5554297","Type":"ContainerDied","Data":"5f09907fa9825b233b6a6c79f73d1ad5eebc4391dd85be5b797207577fbb4633"} Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.013442 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.019200 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042017 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/052726a0-d05a-4635-b963-5839f5554297-config-data\") pod \"052726a0-d05a-4635-b963-5839f5554297\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042088 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-scripts\") pod \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042126 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-dev\") pod \"052726a0-d05a-4635-b963-5839f5554297\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042158 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-run\") pod \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042209 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-httpd-run\") pod \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042242 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-lib-modules\") pod \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042290 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-sys\") pod \"052726a0-d05a-4635-b963-5839f5554297\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042355 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042362 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "f769aa7e-3bd6-4c79-aa02-8d2df1da4133" (UID: "f769aa7e-3bd6-4c79-aa02-8d2df1da4133"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042387 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042374 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-run" (OuterVolumeSpecName: "run") pod "f769aa7e-3bd6-4c79-aa02-8d2df1da4133" (UID: "f769aa7e-3bd6-4c79-aa02-8d2df1da4133"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042417 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-etc-iscsi\") pod \"052726a0-d05a-4635-b963-5839f5554297\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042428 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-sys" (OuterVolumeSpecName: "sys") pod "052726a0-d05a-4635-b963-5839f5554297" (UID: "052726a0-d05a-4635-b963-5839f5554297"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042449 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-sys\") pod \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042485 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cz9q\" (UniqueName: \"kubernetes.io/projected/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-kube-api-access-2cz9q\") pod \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042488 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "052726a0-d05a-4635-b963-5839f5554297" (UID: "052726a0-d05a-4635-b963-5839f5554297"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042544 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-etc-nvme\") pod \"052726a0-d05a-4635-b963-5839f5554297\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042605 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-lib-modules\") pod \"052726a0-d05a-4635-b963-5839f5554297\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042645 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-etc-iscsi\") pod \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042688 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-logs\") pod \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042717 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"052726a0-d05a-4635-b963-5839f5554297\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042751 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkk44\" (UniqueName: 
\"kubernetes.io/projected/052726a0-d05a-4635-b963-5839f5554297-kube-api-access-vkk44\") pod \"052726a0-d05a-4635-b963-5839f5554297\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042791 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-var-locks-brick\") pod \"052726a0-d05a-4635-b963-5839f5554297\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042817 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f769aa7e-3bd6-4c79-aa02-8d2df1da4133" (UID: "f769aa7e-3bd6-4c79-aa02-8d2df1da4133"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042826 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"052726a0-d05a-4635-b963-5839f5554297\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.042986 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-dev\") pod \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.043029 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-etc-nvme\") pod \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " Nov 26 22:58:05 crc kubenswrapper[5008]: 
I1126 22:58:05.043059 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-var-locks-brick\") pod \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.043117 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/052726a0-d05a-4635-b963-5839f5554297-httpd-run\") pod \"052726a0-d05a-4635-b963-5839f5554297\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.043184 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-config-data\") pod \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\" (UID: \"f769aa7e-3bd6-4c79-aa02-8d2df1da4133\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.043221 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/052726a0-d05a-4635-b963-5839f5554297-logs\") pod \"052726a0-d05a-4635-b963-5839f5554297\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.043250 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-run\") pod \"052726a0-d05a-4635-b963-5839f5554297\" (UID: \"052726a0-d05a-4635-b963-5839f5554297\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.043289 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/052726a0-d05a-4635-b963-5839f5554297-scripts\") pod \"052726a0-d05a-4635-b963-5839f5554297\" (UID: 
\"052726a0-d05a-4635-b963-5839f5554297\") " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.043925 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.043946 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.044012 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.044032 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-sys\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.044052 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.043021 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-dev" (OuterVolumeSpecName: "dev") pod "052726a0-d05a-4635-b963-5839f5554297" (UID: "052726a0-d05a-4635-b963-5839f5554297"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.043050 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "f769aa7e-3bd6-4c79-aa02-8d2df1da4133" (UID: "f769aa7e-3bd6-4c79-aa02-8d2df1da4133"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.043112 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "052726a0-d05a-4635-b963-5839f5554297" (UID: "052726a0-d05a-4635-b963-5839f5554297"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.043143 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "052726a0-d05a-4635-b963-5839f5554297" (UID: "052726a0-d05a-4635-b963-5839f5554297"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.045707 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-sys" (OuterVolumeSpecName: "sys") pod "f769aa7e-3bd6-4c79-aa02-8d2df1da4133" (UID: "f769aa7e-3bd6-4c79-aa02-8d2df1da4133"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.045880 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "f769aa7e-3bd6-4c79-aa02-8d2df1da4133" (UID: "f769aa7e-3bd6-4c79-aa02-8d2df1da4133"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.045923 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-dev" (OuterVolumeSpecName: "dev") pod "f769aa7e-3bd6-4c79-aa02-8d2df1da4133" (UID: "f769aa7e-3bd6-4c79-aa02-8d2df1da4133"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.046064 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "f769aa7e-3bd6-4c79-aa02-8d2df1da4133" (UID: "f769aa7e-3bd6-4c79-aa02-8d2df1da4133"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.046366 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "052726a0-d05a-4635-b963-5839f5554297" (UID: "052726a0-d05a-4635-b963-5839f5554297"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.046661 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-logs" (OuterVolumeSpecName: "logs") pod "f769aa7e-3bd6-4c79-aa02-8d2df1da4133" (UID: "f769aa7e-3bd6-4c79-aa02-8d2df1da4133"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.047422 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/052726a0-d05a-4635-b963-5839f5554297-logs" (OuterVolumeSpecName: "logs") pod "052726a0-d05a-4635-b963-5839f5554297" (UID: "052726a0-d05a-4635-b963-5839f5554297"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.047877 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/052726a0-d05a-4635-b963-5839f5554297-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "052726a0-d05a-4635-b963-5839f5554297" (UID: "052726a0-d05a-4635-b963-5839f5554297"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.047933 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-run" (OuterVolumeSpecName: "run") pod "052726a0-d05a-4635-b963-5839f5554297" (UID: "052726a0-d05a-4635-b963-5839f5554297"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.050320 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-scripts" (OuterVolumeSpecName: "scripts") pod "f769aa7e-3bd6-4c79-aa02-8d2df1da4133" (UID: "f769aa7e-3bd6-4c79-aa02-8d2df1da4133"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.052180 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/052726a0-d05a-4635-b963-5839f5554297-kube-api-access-vkk44" (OuterVolumeSpecName: "kube-api-access-vkk44") pod "052726a0-d05a-4635-b963-5839f5554297" (UID: "052726a0-d05a-4635-b963-5839f5554297"). InnerVolumeSpecName "kube-api-access-vkk44". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.053885 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance-cache") pod "052726a0-d05a-4635-b963-5839f5554297" (UID: "052726a0-d05a-4635-b963-5839f5554297"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.053995 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "052726a0-d05a-4635-b963-5839f5554297" (UID: "052726a0-d05a-4635-b963-5839f5554297"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.054800 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance-cache") pod "f769aa7e-3bd6-4c79-aa02-8d2df1da4133" (UID: "f769aa7e-3bd6-4c79-aa02-8d2df1da4133"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.058809 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "glance") pod "f769aa7e-3bd6-4c79-aa02-8d2df1da4133" (UID: "f769aa7e-3bd6-4c79-aa02-8d2df1da4133"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.059413 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-kube-api-access-2cz9q" (OuterVolumeSpecName: "kube-api-access-2cz9q") pod "f769aa7e-3bd6-4c79-aa02-8d2df1da4133" (UID: "f769aa7e-3bd6-4c79-aa02-8d2df1da4133"). InnerVolumeSpecName "kube-api-access-2cz9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.061030 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/052726a0-d05a-4635-b963-5839f5554297-scripts" (OuterVolumeSpecName: "scripts") pod "052726a0-d05a-4635-b963-5839f5554297" (UID: "052726a0-d05a-4635-b963-5839f5554297"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.123239 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/052726a0-d05a-4635-b963-5839f5554297-config-data" (OuterVolumeSpecName: "config-data") pod "052726a0-d05a-4635-b963-5839f5554297" (UID: "052726a0-d05a-4635-b963-5839f5554297"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.140831 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-config-data" (OuterVolumeSpecName: "config-data") pod "f769aa7e-3bd6-4c79-aa02-8d2df1da4133" (UID: "f769aa7e-3bd6-4c79-aa02-8d2df1da4133"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145657 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/052726a0-d05a-4635-b963-5839f5554297-logs\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145690 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145698 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/052726a0-d05a-4635-b963-5839f5554297-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145709 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/052726a0-d05a-4635-b963-5839f5554297-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145718 5008 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145727 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-dev\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145763 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145778 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145787 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-sys\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145796 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cz9q\" (UniqueName: \"kubernetes.io/projected/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-kube-api-access-2cz9q\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145805 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145812 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 
crc kubenswrapper[5008]: I1126 22:58:05.145820 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145829 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-logs\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145843 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145851 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkk44\" (UniqueName: \"kubernetes.io/projected/052726a0-d05a-4635-b963-5839f5554297-kube-api-access-vkk44\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145860 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/052726a0-d05a-4635-b963-5839f5554297-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145874 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145882 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-dev\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145891 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145899 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145906 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/052726a0-d05a-4635-b963-5839f5554297-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.145914 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f769aa7e-3bd6-4c79-aa02-8d2df1da4133-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.159274 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.160021 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.165205 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.167398 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.238198 5008 generic.go:334] "Generic (PLEG): container finished" podID="f769aa7e-3bd6-4c79-aa02-8d2df1da4133" 
containerID="77ce72c19ff94751d7fca414fa1e538a590c3d6b6000b1f3c480efa69975a4bb" exitCode=0 Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.238314 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"f769aa7e-3bd6-4c79-aa02-8d2df1da4133","Type":"ContainerDied","Data":"77ce72c19ff94751d7fca414fa1e538a590c3d6b6000b1f3c480efa69975a4bb"} Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.238385 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"f769aa7e-3bd6-4c79-aa02-8d2df1da4133","Type":"ContainerDied","Data":"2a9ac46f686308e45bb2fa2ad4add4f5d17bd0358f466b8bf0783f2138a38e8f"} Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.238389 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.238418 5008 scope.go:117] "RemoveContainer" containerID="77ce72c19ff94751d7fca414fa1e538a590c3d6b6000b1f3c480efa69975a4bb" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.241959 5008 generic.go:334] "Generic (PLEG): container finished" podID="052726a0-d05a-4635-b963-5839f5554297" containerID="8603f85031ba40914d07f90db4065fa3f2623650229e114e876af3561d209a75" exitCode=0 Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.242108 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"052726a0-d05a-4635-b963-5839f5554297","Type":"ContainerDied","Data":"8603f85031ba40914d07f90db4065fa3f2623650229e114e876af3561d209a75"} Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.242154 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"052726a0-d05a-4635-b963-5839f5554297","Type":"ContainerDied","Data":"caf7a16d5d2d52f85509a332421d74035a0301fb0fc3bd63103e717df7ab4934"} Nov 26 22:58:05 
crc kubenswrapper[5008]: I1126 22:58:05.242247 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.246879 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.246917 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.246930 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.246942 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.266431 5008 scope.go:117] "RemoveContainer" containerID="5d66dae61076a74e1b092908db303873156ed2e6ca6e2a1e86d3b0933a7ccc76" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.291702 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.295893 5008 scope.go:117] "RemoveContainer" containerID="77ce72c19ff94751d7fca414fa1e538a590c3d6b6000b1f3c480efa69975a4bb" Nov 26 22:58:05 crc kubenswrapper[5008]: E1126 22:58:05.296726 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77ce72c19ff94751d7fca414fa1e538a590c3d6b6000b1f3c480efa69975a4bb\": container with ID starting with 
77ce72c19ff94751d7fca414fa1e538a590c3d6b6000b1f3c480efa69975a4bb not found: ID does not exist" containerID="77ce72c19ff94751d7fca414fa1e538a590c3d6b6000b1f3c480efa69975a4bb" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.296787 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77ce72c19ff94751d7fca414fa1e538a590c3d6b6000b1f3c480efa69975a4bb"} err="failed to get container status \"77ce72c19ff94751d7fca414fa1e538a590c3d6b6000b1f3c480efa69975a4bb\": rpc error: code = NotFound desc = could not find container \"77ce72c19ff94751d7fca414fa1e538a590c3d6b6000b1f3c480efa69975a4bb\": container with ID starting with 77ce72c19ff94751d7fca414fa1e538a590c3d6b6000b1f3c480efa69975a4bb not found: ID does not exist" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.296821 5008 scope.go:117] "RemoveContainer" containerID="5d66dae61076a74e1b092908db303873156ed2e6ca6e2a1e86d3b0933a7ccc76" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.297747 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 22:58:05 crc kubenswrapper[5008]: E1126 22:58:05.299328 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d66dae61076a74e1b092908db303873156ed2e6ca6e2a1e86d3b0933a7ccc76\": container with ID starting with 5d66dae61076a74e1b092908db303873156ed2e6ca6e2a1e86d3b0933a7ccc76 not found: ID does not exist" containerID="5d66dae61076a74e1b092908db303873156ed2e6ca6e2a1e86d3b0933a7ccc76" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.299370 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d66dae61076a74e1b092908db303873156ed2e6ca6e2a1e86d3b0933a7ccc76"} err="failed to get container status \"5d66dae61076a74e1b092908db303873156ed2e6ca6e2a1e86d3b0933a7ccc76\": rpc error: code = NotFound desc = could not find container 
\"5d66dae61076a74e1b092908db303873156ed2e6ca6e2a1e86d3b0933a7ccc76\": container with ID starting with 5d66dae61076a74e1b092908db303873156ed2e6ca6e2a1e86d3b0933a7ccc76 not found: ID does not exist" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.299401 5008 scope.go:117] "RemoveContainer" containerID="8603f85031ba40914d07f90db4065fa3f2623650229e114e876af3561d209a75" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.307861 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.313875 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.334421 5008 scope.go:117] "RemoveContainer" containerID="5f09907fa9825b233b6a6c79f73d1ad5eebc4391dd85be5b797207577fbb4633" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.355553 5008 scope.go:117] "RemoveContainer" containerID="8603f85031ba40914d07f90db4065fa3f2623650229e114e876af3561d209a75" Nov 26 22:58:05 crc kubenswrapper[5008]: E1126 22:58:05.355837 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8603f85031ba40914d07f90db4065fa3f2623650229e114e876af3561d209a75\": container with ID starting with 8603f85031ba40914d07f90db4065fa3f2623650229e114e876af3561d209a75 not found: ID does not exist" containerID="8603f85031ba40914d07f90db4065fa3f2623650229e114e876af3561d209a75" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.355873 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8603f85031ba40914d07f90db4065fa3f2623650229e114e876af3561d209a75"} err="failed to get container status \"8603f85031ba40914d07f90db4065fa3f2623650229e114e876af3561d209a75\": rpc error: code = NotFound desc = could not find container 
\"8603f85031ba40914d07f90db4065fa3f2623650229e114e876af3561d209a75\": container with ID starting with 8603f85031ba40914d07f90db4065fa3f2623650229e114e876af3561d209a75 not found: ID does not exist" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.355899 5008 scope.go:117] "RemoveContainer" containerID="5f09907fa9825b233b6a6c79f73d1ad5eebc4391dd85be5b797207577fbb4633" Nov 26 22:58:05 crc kubenswrapper[5008]: E1126 22:58:05.356433 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f09907fa9825b233b6a6c79f73d1ad5eebc4391dd85be5b797207577fbb4633\": container with ID starting with 5f09907fa9825b233b6a6c79f73d1ad5eebc4391dd85be5b797207577fbb4633 not found: ID does not exist" containerID="5f09907fa9825b233b6a6c79f73d1ad5eebc4391dd85be5b797207577fbb4633" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.356454 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f09907fa9825b233b6a6c79f73d1ad5eebc4391dd85be5b797207577fbb4633"} err="failed to get container status \"5f09907fa9825b233b6a6c79f73d1ad5eebc4391dd85be5b797207577fbb4633\": rpc error: code = NotFound desc = could not find container \"5f09907fa9825b233b6a6c79f73d1ad5eebc4391dd85be5b797207577fbb4633\": container with ID starting with 5f09907fa9825b233b6a6c79f73d1ad5eebc4391dd85be5b797207577fbb4633 not found: ID does not exist" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.534616 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="052726a0-d05a-4635-b963-5839f5554297" path="/var/lib/kubelet/pods/052726a0-d05a-4635-b963-5839f5554297/volumes" Nov 26 22:58:05 crc kubenswrapper[5008]: I1126 22:58:05.536076 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f769aa7e-3bd6-4c79-aa02-8d2df1da4133" path="/var/lib/kubelet/pods/f769aa7e-3bd6-4c79-aa02-8d2df1da4133/volumes" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.826132 
5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 22:58:06 crc kubenswrapper[5008]: E1126 22:58:06.829476 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f769aa7e-3bd6-4c79-aa02-8d2df1da4133" containerName="glance-log" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.829513 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f769aa7e-3bd6-4c79-aa02-8d2df1da4133" containerName="glance-log" Nov 26 22:58:06 crc kubenswrapper[5008]: E1126 22:58:06.829590 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="052726a0-d05a-4635-b963-5839f5554297" containerName="glance-httpd" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.829605 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="052726a0-d05a-4635-b963-5839f5554297" containerName="glance-httpd" Nov 26 22:58:06 crc kubenswrapper[5008]: E1126 22:58:06.829678 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f769aa7e-3bd6-4c79-aa02-8d2df1da4133" containerName="glance-httpd" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.829696 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f769aa7e-3bd6-4c79-aa02-8d2df1da4133" containerName="glance-httpd" Nov 26 22:58:06 crc kubenswrapper[5008]: E1126 22:58:06.829771 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1697aef7-c233-444e-bc97-5a0ac3cd5817" containerName="glance-db-sync" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.829789 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1697aef7-c233-444e-bc97-5a0ac3cd5817" containerName="glance-db-sync" Nov 26 22:58:06 crc kubenswrapper[5008]: E1126 22:58:06.829812 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="052726a0-d05a-4635-b963-5839f5554297" containerName="glance-log" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.829824 5008 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="052726a0-d05a-4635-b963-5839f5554297" containerName="glance-log" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.830093 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f769aa7e-3bd6-4c79-aa02-8d2df1da4133" containerName="glance-httpd" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.830122 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="052726a0-d05a-4635-b963-5839f5554297" containerName="glance-log" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.830140 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="052726a0-d05a-4635-b963-5839f5554297" containerName="glance-httpd" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.830158 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f769aa7e-3bd6-4c79-aa02-8d2df1da4133" containerName="glance-log" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.830172 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="1697aef7-c233-444e-bc97-5a0ac3cd5817" containerName="glance-db-sync" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.834230 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.837291 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-p4jpk" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.837476 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.837510 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.837478 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.838451 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-internal-svc" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.839725 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.839797 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-public-svc" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.872707 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-scripts\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.872805 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" 
(UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.872841 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.872875 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.872914 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce8f090c-d792-47f3-b734-5c59d4762eb9-httpd-run\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.873022 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce8f090c-d792-47f3-b734-5c59d4762eb9-logs\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.873071 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-public-tls-certs\") pod 
\"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.873122 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-config-data\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.873167 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9vrp\" (UniqueName: \"kubernetes.io/projected/ce8f090c-d792-47f3-b734-5c59d4762eb9-kube-api-access-j9vrp\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.974257 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.974345 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-config-data\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.974396 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9vrp\" (UniqueName: \"kubernetes.io/projected/ce8f090c-d792-47f3-b734-5c59d4762eb9-kube-api-access-j9vrp\") pod 
\"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.974436 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-scripts\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.974510 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.974543 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.974574 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.974617 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce8f090c-d792-47f3-b734-5c59d4762eb9-httpd-run\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " 
pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.974764 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce8f090c-d792-47f3-b734-5c59d4762eb9-logs\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.975119 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.975684 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce8f090c-d792-47f3-b734-5c59d4762eb9-logs\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.975775 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce8f090c-d792-47f3-b734-5c59d4762eb9-httpd-run\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.981506 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.984131 
5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-config-data\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.985347 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.985610 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.986750 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-scripts\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:06 crc kubenswrapper[5008]: I1126 22:58:06.992515 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9vrp\" (UniqueName: \"kubernetes.io/projected/ce8f090c-d792-47f3-b734-5c59d4762eb9-kube-api-access-j9vrp\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:07 crc kubenswrapper[5008]: I1126 22:58:07.007646 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:07 crc kubenswrapper[5008]: I1126 22:58:07.169793 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:07 crc kubenswrapper[5008]: I1126 22:58:07.470449 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 22:58:08 crc kubenswrapper[5008]: I1126 22:58:08.275554 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ce8f090c-d792-47f3-b734-5c59d4762eb9","Type":"ContainerStarted","Data":"61992d5edf293ba77db918c0721c63336f6906e5680ac86e6826ac4564804416"} Nov 26 22:58:09 crc kubenswrapper[5008]: I1126 22:58:09.290433 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ce8f090c-d792-47f3-b734-5c59d4762eb9","Type":"ContainerStarted","Data":"e842a3765e7e34807bc58bd4eb69f9e0b2c72a6c0ddf7c64191cf78d14375929"} Nov 26 22:58:09 crc kubenswrapper[5008]: I1126 22:58:09.291011 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ce8f090c-d792-47f3-b734-5c59d4762eb9","Type":"ContainerStarted","Data":"2daf9aff46f7216a2056082f139a2ead15d3f7e38625e5d6e285b9293bfa80a7"} Nov 26 22:58:09 crc kubenswrapper[5008]: I1126 22:58:09.333048 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=3.333025 podStartE2EDuration="3.333025s" podCreationTimestamp="2025-11-26 22:58:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:58:09.325867355 +0000 UTC m=+1164.738561397" 
watchObservedRunningTime="2025-11-26 22:58:09.333025 +0000 UTC m=+1164.745719012" Nov 26 22:58:17 crc kubenswrapper[5008]: I1126 22:58:17.170916 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:17 crc kubenswrapper[5008]: I1126 22:58:17.171703 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:17 crc kubenswrapper[5008]: I1126 22:58:17.252063 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:17 crc kubenswrapper[5008]: I1126 22:58:17.330564 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:17 crc kubenswrapper[5008]: I1126 22:58:17.371307 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:17 crc kubenswrapper[5008]: I1126 22:58:17.371350 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:19 crc kubenswrapper[5008]: I1126 22:58:19.183574 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:19 crc kubenswrapper[5008]: I1126 22:58:19.186897 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:20 crc kubenswrapper[5008]: I1126 22:58:20.428155 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-2nxfj"] Nov 26 22:58:20 crc kubenswrapper[5008]: I1126 22:58:20.434704 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-2nxfj"] Nov 26 22:58:20 crc kubenswrapper[5008]: I1126 22:58:20.472218 5008 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["glance-kuttl-tests/glance5a2c-account-delete-5s8sl"] Nov 26 22:58:20 crc kubenswrapper[5008]: I1126 22:58:20.473230 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance5a2c-account-delete-5s8sl" Nov 26 22:58:20 crc kubenswrapper[5008]: I1126 22:58:20.488865 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance5a2c-account-delete-5s8sl"] Nov 26 22:58:20 crc kubenswrapper[5008]: E1126 22:58:20.549446 5008 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-scripts: secret "glance-scripts" not found Nov 26 22:58:20 crc kubenswrapper[5008]: E1126 22:58:20.549659 5008 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-default-single-config-data: secret "glance-default-single-config-data" not found Nov 26 22:58:20 crc kubenswrapper[5008]: E1126 22:58:20.549759 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-scripts podName:ce8f090c-d792-47f3-b734-5c59d4762eb9 nodeName:}" failed. No retries permitted until 2025-11-26 22:58:21.04973601 +0000 UTC m=+1176.462430012 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-scripts") pod "glance-default-single-0" (UID: "ce8f090c-d792-47f3-b734-5c59d4762eb9") : secret "glance-scripts" not found Nov 26 22:58:20 crc kubenswrapper[5008]: E1126 22:58:20.549826 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-config-data podName:ce8f090c-d792-47f3-b734-5c59d4762eb9 nodeName:}" failed. No retries permitted until 2025-11-26 22:58:21.049801133 +0000 UTC m=+1176.462495145 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-config-data") pod "glance-default-single-0" (UID: "ce8f090c-d792-47f3-b734-5c59d4762eb9") : secret "glance-default-single-config-data" not found Nov 26 22:58:20 crc kubenswrapper[5008]: I1126 22:58:20.568373 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 22:58:20 crc kubenswrapper[5008]: I1126 22:58:20.650550 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c6765f-b967-410e-b07d-75cfb742e780-operator-scripts\") pod \"glance5a2c-account-delete-5s8sl\" (UID: \"45c6765f-b967-410e-b07d-75cfb742e780\") " pod="glance-kuttl-tests/glance5a2c-account-delete-5s8sl" Nov 26 22:58:20 crc kubenswrapper[5008]: I1126 22:58:20.650628 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng69d\" (UniqueName: \"kubernetes.io/projected/45c6765f-b967-410e-b07d-75cfb742e780-kube-api-access-ng69d\") pod \"glance5a2c-account-delete-5s8sl\" (UID: \"45c6765f-b967-410e-b07d-75cfb742e780\") " pod="glance-kuttl-tests/glance5a2c-account-delete-5s8sl" Nov 26 22:58:20 crc kubenswrapper[5008]: I1126 22:58:20.751672 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c6765f-b967-410e-b07d-75cfb742e780-operator-scripts\") pod \"glance5a2c-account-delete-5s8sl\" (UID: \"45c6765f-b967-410e-b07d-75cfb742e780\") " pod="glance-kuttl-tests/glance5a2c-account-delete-5s8sl" Nov 26 22:58:20 crc kubenswrapper[5008]: I1126 22:58:20.751773 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng69d\" (UniqueName: \"kubernetes.io/projected/45c6765f-b967-410e-b07d-75cfb742e780-kube-api-access-ng69d\") pod 
\"glance5a2c-account-delete-5s8sl\" (UID: \"45c6765f-b967-410e-b07d-75cfb742e780\") " pod="glance-kuttl-tests/glance5a2c-account-delete-5s8sl" Nov 26 22:58:20 crc kubenswrapper[5008]: I1126 22:58:20.752529 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c6765f-b967-410e-b07d-75cfb742e780-operator-scripts\") pod \"glance5a2c-account-delete-5s8sl\" (UID: \"45c6765f-b967-410e-b07d-75cfb742e780\") " pod="glance-kuttl-tests/glance5a2c-account-delete-5s8sl" Nov 26 22:58:20 crc kubenswrapper[5008]: I1126 22:58:20.771173 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng69d\" (UniqueName: \"kubernetes.io/projected/45c6765f-b967-410e-b07d-75cfb742e780-kube-api-access-ng69d\") pod \"glance5a2c-account-delete-5s8sl\" (UID: \"45c6765f-b967-410e-b07d-75cfb742e780\") " pod="glance-kuttl-tests/glance5a2c-account-delete-5s8sl" Nov 26 22:58:20 crc kubenswrapper[5008]: I1126 22:58:20.794266 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance5a2c-account-delete-5s8sl" Nov 26 22:58:21 crc kubenswrapper[5008]: E1126 22:58:21.056425 5008 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-scripts: secret "glance-scripts" not found Nov 26 22:58:21 crc kubenswrapper[5008]: E1126 22:58:21.056746 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-scripts podName:ce8f090c-d792-47f3-b734-5c59d4762eb9 nodeName:}" failed. No retries permitted until 2025-11-26 22:58:22.056731246 +0000 UTC m=+1177.469425248 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-scripts") pod "glance-default-single-0" (UID: "ce8f090c-d792-47f3-b734-5c59d4762eb9") : secret "glance-scripts" not found Nov 26 22:58:21 crc kubenswrapper[5008]: E1126 22:58:21.056506 5008 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-default-single-config-data: secret "glance-default-single-config-data" not found Nov 26 22:58:21 crc kubenswrapper[5008]: E1126 22:58:21.057013 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-config-data podName:ce8f090c-d792-47f3-b734-5c59d4762eb9 nodeName:}" failed. No retries permitted until 2025-11-26 22:58:22.056977513 +0000 UTC m=+1177.469671535 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-config-data") pod "glance-default-single-0" (UID: "ce8f090c-d792-47f3-b734-5c59d4762eb9") : secret "glance-default-single-config-data" not found Nov 26 22:58:21 crc kubenswrapper[5008]: I1126 22:58:21.244520 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance5a2c-account-delete-5s8sl"] Nov 26 22:58:21 crc kubenswrapper[5008]: I1126 22:58:21.401307 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance5a2c-account-delete-5s8sl" event={"ID":"45c6765f-b967-410e-b07d-75cfb742e780","Type":"ContainerStarted","Data":"fce71689b5d4647939e54ac1138a95f7bbd5bf89ba9382ff858ba0b7e4a4a4b2"} Nov 26 22:58:21 crc kubenswrapper[5008]: I1126 22:58:21.401492 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="ce8f090c-d792-47f3-b734-5c59d4762eb9" containerName="glance-log" containerID="cri-o://2daf9aff46f7216a2056082f139a2ead15d3f7e38625e5d6e285b9293bfa80a7" gracePeriod=30 Nov 26 22:58:21 crc 
kubenswrapper[5008]: I1126 22:58:21.401530 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="ce8f090c-d792-47f3-b734-5c59d4762eb9" containerName="glance-httpd" containerID="cri-o://e842a3765e7e34807bc58bd4eb69f9e0b2c72a6c0ddf7c64191cf78d14375929" gracePeriod=30 Nov 26 22:58:21 crc kubenswrapper[5008]: I1126 22:58:21.532898 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1697aef7-c233-444e-bc97-5a0ac3cd5817" path="/var/lib/kubelet/pods/1697aef7-c233-444e-bc97-5a0ac3cd5817/volumes" Nov 26 22:58:22 crc kubenswrapper[5008]: E1126 22:58:22.069082 5008 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-scripts: secret "glance-scripts" not found Nov 26 22:58:22 crc kubenswrapper[5008]: E1126 22:58:22.069520 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-scripts podName:ce8f090c-d792-47f3-b734-5c59d4762eb9 nodeName:}" failed. No retries permitted until 2025-11-26 22:58:24.069493019 +0000 UTC m=+1179.482187051 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-scripts") pod "glance-default-single-0" (UID: "ce8f090c-d792-47f3-b734-5c59d4762eb9") : secret "glance-scripts" not found Nov 26 22:58:22 crc kubenswrapper[5008]: E1126 22:58:22.069089 5008 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-default-single-config-data: secret "glance-default-single-config-data" not found Nov 26 22:58:22 crc kubenswrapper[5008]: E1126 22:58:22.069594 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-config-data podName:ce8f090c-d792-47f3-b734-5c59d4762eb9 nodeName:}" failed. No retries permitted until 2025-11-26 22:58:24.069576622 +0000 UTC m=+1179.482270614 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-config-data") pod "glance-default-single-0" (UID: "ce8f090c-d792-47f3-b734-5c59d4762eb9") : secret "glance-default-single-config-data" not found Nov 26 22:58:22 crc kubenswrapper[5008]: I1126 22:58:22.409946 5008 generic.go:334] "Generic (PLEG): container finished" podID="ce8f090c-d792-47f3-b734-5c59d4762eb9" containerID="2daf9aff46f7216a2056082f139a2ead15d3f7e38625e5d6e285b9293bfa80a7" exitCode=143 Nov 26 22:58:22 crc kubenswrapper[5008]: I1126 22:58:22.409997 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ce8f090c-d792-47f3-b734-5c59d4762eb9","Type":"ContainerDied","Data":"2daf9aff46f7216a2056082f139a2ead15d3f7e38625e5d6e285b9293bfa80a7"} Nov 26 22:58:22 crc kubenswrapper[5008]: I1126 22:58:22.411977 5008 generic.go:334] "Generic (PLEG): container finished" podID="45c6765f-b967-410e-b07d-75cfb742e780" containerID="44518e0d07613dc63e03812f587546c7f0bcaf8f56b777e92bf688080386b293" exitCode=0 Nov 26 22:58:22 crc kubenswrapper[5008]: I1126 22:58:22.412011 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance5a2c-account-delete-5s8sl" event={"ID":"45c6765f-b967-410e-b07d-75cfb742e780","Type":"ContainerDied","Data":"44518e0d07613dc63e03812f587546c7f0bcaf8f56b777e92bf688080386b293"} Nov 26 22:58:23 crc kubenswrapper[5008]: I1126 22:58:23.825195 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance5a2c-account-delete-5s8sl" Nov 26 22:58:24 crc kubenswrapper[5008]: I1126 22:58:24.009744 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c6765f-b967-410e-b07d-75cfb742e780-operator-scripts\") pod \"45c6765f-b967-410e-b07d-75cfb742e780\" (UID: \"45c6765f-b967-410e-b07d-75cfb742e780\") " Nov 26 22:58:24 crc kubenswrapper[5008]: I1126 22:58:24.009819 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng69d\" (UniqueName: \"kubernetes.io/projected/45c6765f-b967-410e-b07d-75cfb742e780-kube-api-access-ng69d\") pod \"45c6765f-b967-410e-b07d-75cfb742e780\" (UID: \"45c6765f-b967-410e-b07d-75cfb742e780\") " Nov 26 22:58:24 crc kubenswrapper[5008]: I1126 22:58:24.010922 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45c6765f-b967-410e-b07d-75cfb742e780-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45c6765f-b967-410e-b07d-75cfb742e780" (UID: "45c6765f-b967-410e-b07d-75cfb742e780"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:58:24 crc kubenswrapper[5008]: I1126 22:58:24.015495 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45c6765f-b967-410e-b07d-75cfb742e780-kube-api-access-ng69d" (OuterVolumeSpecName: "kube-api-access-ng69d") pod "45c6765f-b967-410e-b07d-75cfb742e780" (UID: "45c6765f-b967-410e-b07d-75cfb742e780"). InnerVolumeSpecName "kube-api-access-ng69d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:58:24 crc kubenswrapper[5008]: I1126 22:58:24.111213 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c6765f-b967-410e-b07d-75cfb742e780-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:24 crc kubenswrapper[5008]: I1126 22:58:24.111244 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng69d\" (UniqueName: \"kubernetes.io/projected/45c6765f-b967-410e-b07d-75cfb742e780-kube-api-access-ng69d\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:24 crc kubenswrapper[5008]: E1126 22:58:24.111250 5008 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-default-single-config-data: secret "glance-default-single-config-data" not found Nov 26 22:58:24 crc kubenswrapper[5008]: E1126 22:58:24.111306 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-config-data podName:ce8f090c-d792-47f3-b734-5c59d4762eb9 nodeName:}" failed. No retries permitted until 2025-11-26 22:58:28.111289926 +0000 UTC m=+1183.523983938 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-config-data") pod "glance-default-single-0" (UID: "ce8f090c-d792-47f3-b734-5c59d4762eb9") : secret "glance-default-single-config-data" not found Nov 26 22:58:24 crc kubenswrapper[5008]: E1126 22:58:24.111306 5008 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-scripts: secret "glance-scripts" not found Nov 26 22:58:24 crc kubenswrapper[5008]: E1126 22:58:24.111346 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-scripts podName:ce8f090c-d792-47f3-b734-5c59d4762eb9 nodeName:}" failed. 
No retries permitted until 2025-11-26 22:58:28.111333967 +0000 UTC m=+1183.524027979 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-scripts") pod "glance-default-single-0" (UID: "ce8f090c-d792-47f3-b734-5c59d4762eb9") : secret "glance-scripts" not found Nov 26 22:58:24 crc kubenswrapper[5008]: I1126 22:58:24.434825 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance5a2c-account-delete-5s8sl" event={"ID":"45c6765f-b967-410e-b07d-75cfb742e780","Type":"ContainerDied","Data":"fce71689b5d4647939e54ac1138a95f7bbd5bf89ba9382ff858ba0b7e4a4a4b2"} Nov 26 22:58:24 crc kubenswrapper[5008]: I1126 22:58:24.434864 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance5a2c-account-delete-5s8sl" Nov 26 22:58:24 crc kubenswrapper[5008]: I1126 22:58:24.434865 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fce71689b5d4647939e54ac1138a95f7bbd5bf89ba9382ff858ba0b7e4a4a4b2" Nov 26 22:58:24 crc kubenswrapper[5008]: I1126 22:58:24.920925 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.022725 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-config-data\") pod \"ce8f090c-d792-47f3-b734-5c59d4762eb9\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.022799 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ce8f090c-d792-47f3-b734-5c59d4762eb9\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.022858 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-internal-tls-certs\") pod \"ce8f090c-d792-47f3-b734-5c59d4762eb9\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.022936 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-combined-ca-bundle\") pod \"ce8f090c-d792-47f3-b734-5c59d4762eb9\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.022987 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce8f090c-d792-47f3-b734-5c59d4762eb9-httpd-run\") pod \"ce8f090c-d792-47f3-b734-5c59d4762eb9\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.023015 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-scripts\") pod \"ce8f090c-d792-47f3-b734-5c59d4762eb9\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.023040 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce8f090c-d792-47f3-b734-5c59d4762eb9-logs\") pod \"ce8f090c-d792-47f3-b734-5c59d4762eb9\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.023078 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9vrp\" (UniqueName: \"kubernetes.io/projected/ce8f090c-d792-47f3-b734-5c59d4762eb9-kube-api-access-j9vrp\") pod \"ce8f090c-d792-47f3-b734-5c59d4762eb9\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.023140 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-public-tls-certs\") pod \"ce8f090c-d792-47f3-b734-5c59d4762eb9\" (UID: \"ce8f090c-d792-47f3-b734-5c59d4762eb9\") " Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.024022 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce8f090c-d792-47f3-b734-5c59d4762eb9-logs" (OuterVolumeSpecName: "logs") pod "ce8f090c-d792-47f3-b734-5c59d4762eb9" (UID: "ce8f090c-d792-47f3-b734-5c59d4762eb9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.024603 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce8f090c-d792-47f3-b734-5c59d4762eb9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ce8f090c-d792-47f3-b734-5c59d4762eb9" (UID: "ce8f090c-d792-47f3-b734-5c59d4762eb9"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.027428 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "ce8f090c-d792-47f3-b734-5c59d4762eb9" (UID: "ce8f090c-d792-47f3-b734-5c59d4762eb9"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.027788 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-scripts" (OuterVolumeSpecName: "scripts") pod "ce8f090c-d792-47f3-b734-5c59d4762eb9" (UID: "ce8f090c-d792-47f3-b734-5c59d4762eb9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.029123 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce8f090c-d792-47f3-b734-5c59d4762eb9-kube-api-access-j9vrp" (OuterVolumeSpecName: "kube-api-access-j9vrp") pod "ce8f090c-d792-47f3-b734-5c59d4762eb9" (UID: "ce8f090c-d792-47f3-b734-5c59d4762eb9"). InnerVolumeSpecName "kube-api-access-j9vrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.048227 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce8f090c-d792-47f3-b734-5c59d4762eb9" (UID: "ce8f090c-d792-47f3-b734-5c59d4762eb9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.058263 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ce8f090c-d792-47f3-b734-5c59d4762eb9" (UID: "ce8f090c-d792-47f3-b734-5c59d4762eb9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.060464 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-config-data" (OuterVolumeSpecName: "config-data") pod "ce8f090c-d792-47f3-b734-5c59d4762eb9" (UID: "ce8f090c-d792-47f3-b734-5c59d4762eb9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.082624 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ce8f090c-d792-47f3-b734-5c59d4762eb9" (UID: "ce8f090c-d792-47f3-b734-5c59d4762eb9"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.125038 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9vrp\" (UniqueName: \"kubernetes.io/projected/ce8f090c-d792-47f3-b734-5c59d4762eb9-kube-api-access-j9vrp\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.125084 5008 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.125102 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.125142 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.125161 5008 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.125178 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.125195 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce8f090c-d792-47f3-b734-5c59d4762eb9-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.125214 5008 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce8f090c-d792-47f3-b734-5c59d4762eb9-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.125229 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce8f090c-d792-47f3-b734-5c59d4762eb9-logs\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.152880 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.226270 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.447617 5008 generic.go:334] "Generic (PLEG): container finished" podID="ce8f090c-d792-47f3-b734-5c59d4762eb9" containerID="e842a3765e7e34807bc58bd4eb69f9e0b2c72a6c0ddf7c64191cf78d14375929" exitCode=0 Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.447675 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ce8f090c-d792-47f3-b734-5c59d4762eb9","Type":"ContainerDied","Data":"e842a3765e7e34807bc58bd4eb69f9e0b2c72a6c0ddf7c64191cf78d14375929"} Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.447723 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ce8f090c-d792-47f3-b734-5c59d4762eb9","Type":"ContainerDied","Data":"61992d5edf293ba77db918c0721c63336f6906e5680ac86e6826ac4564804416"} Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.447753 5008 scope.go:117] "RemoveContainer" containerID="e842a3765e7e34807bc58bd4eb69f9e0b2c72a6c0ddf7c64191cf78d14375929" Nov 26 22:58:25 crc 
kubenswrapper[5008]: I1126 22:58:25.447758 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.486684 5008 scope.go:117] "RemoveContainer" containerID="2daf9aff46f7216a2056082f139a2ead15d3f7e38625e5d6e285b9293bfa80a7" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.507759 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-rdkjn"] Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.537592 5008 scope.go:117] "RemoveContainer" containerID="e842a3765e7e34807bc58bd4eb69f9e0b2c72a6c0ddf7c64191cf78d14375929" Nov 26 22:58:25 crc kubenswrapper[5008]: E1126 22:58:25.538867 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e842a3765e7e34807bc58bd4eb69f9e0b2c72a6c0ddf7c64191cf78d14375929\": container with ID starting with e842a3765e7e34807bc58bd4eb69f9e0b2c72a6c0ddf7c64191cf78d14375929 not found: ID does not exist" containerID="e842a3765e7e34807bc58bd4eb69f9e0b2c72a6c0ddf7c64191cf78d14375929" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.539114 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e842a3765e7e34807bc58bd4eb69f9e0b2c72a6c0ddf7c64191cf78d14375929"} err="failed to get container status \"e842a3765e7e34807bc58bd4eb69f9e0b2c72a6c0ddf7c64191cf78d14375929\": rpc error: code = NotFound desc = could not find container \"e842a3765e7e34807bc58bd4eb69f9e0b2c72a6c0ddf7c64191cf78d14375929\": container with ID starting with e842a3765e7e34807bc58bd4eb69f9e0b2c72a6c0ddf7c64191cf78d14375929 not found: ID does not exist" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.539524 5008 scope.go:117] "RemoveContainer" containerID="2daf9aff46f7216a2056082f139a2ead15d3f7e38625e5d6e285b9293bfa80a7" Nov 26 22:58:25 crc kubenswrapper[5008]: E1126 22:58:25.540717 
5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2daf9aff46f7216a2056082f139a2ead15d3f7e38625e5d6e285b9293bfa80a7\": container with ID starting with 2daf9aff46f7216a2056082f139a2ead15d3f7e38625e5d6e285b9293bfa80a7 not found: ID does not exist" containerID="2daf9aff46f7216a2056082f139a2ead15d3f7e38625e5d6e285b9293bfa80a7" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.541094 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2daf9aff46f7216a2056082f139a2ead15d3f7e38625e5d6e285b9293bfa80a7"} err="failed to get container status \"2daf9aff46f7216a2056082f139a2ead15d3f7e38625e5d6e285b9293bfa80a7\": rpc error: code = NotFound desc = could not find container \"2daf9aff46f7216a2056082f139a2ead15d3f7e38625e5d6e285b9293bfa80a7\": container with ID starting with 2daf9aff46f7216a2056082f139a2ead15d3f7e38625e5d6e285b9293bfa80a7 not found: ID does not exist" Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.553328 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-rdkjn"] Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.553369 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.559393 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-5a2c-account-create-update-w89b5"] Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.567690 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance5a2c-account-delete-5s8sl"] Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.574394 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.580425 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["glance-kuttl-tests/glance-5a2c-account-create-update-w89b5"] Nov 26 22:58:25 crc kubenswrapper[5008]: I1126 22:58:25.586413 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance5a2c-account-delete-5s8sl"] Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.322872 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-h6dk7"] Nov 26 22:58:27 crc kubenswrapper[5008]: E1126 22:58:27.323638 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce8f090c-d792-47f3-b734-5c59d4762eb9" containerName="glance-httpd" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.323660 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce8f090c-d792-47f3-b734-5c59d4762eb9" containerName="glance-httpd" Nov 26 22:58:27 crc kubenswrapper[5008]: E1126 22:58:27.323679 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45c6765f-b967-410e-b07d-75cfb742e780" containerName="mariadb-account-delete" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.323692 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c6765f-b967-410e-b07d-75cfb742e780" containerName="mariadb-account-delete" Nov 26 22:58:27 crc kubenswrapper[5008]: E1126 22:58:27.323722 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce8f090c-d792-47f3-b734-5c59d4762eb9" containerName="glance-log" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.323736 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce8f090c-d792-47f3-b734-5c59d4762eb9" containerName="glance-log" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.323986 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="45c6765f-b967-410e-b07d-75cfb742e780" containerName="mariadb-account-delete" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.324009 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce8f090c-d792-47f3-b734-5c59d4762eb9" 
containerName="glance-httpd" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.324027 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce8f090c-d792-47f3-b734-5c59d4762eb9" containerName="glance-log" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.324807 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-h6dk7" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.340373 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-0d4d-account-create-update-q5hsc"] Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.341882 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-0d4d-account-create-update-q5hsc" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.343769 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.361799 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-h6dk7"] Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.372306 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-0d4d-account-create-update-q5hsc"] Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.377522 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22e07121-401d-4a21-b04c-baf4f8d7e8f2-operator-scripts\") pod \"glance-0d4d-account-create-update-q5hsc\" (UID: \"22e07121-401d-4a21-b04c-baf4f8d7e8f2\") " pod="glance-kuttl-tests/glance-0d4d-account-create-update-q5hsc" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.377593 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d62c944d-3e89-4287-ad8f-98cef6afa353-operator-scripts\") pod \"glance-db-create-h6dk7\" (UID: \"d62c944d-3e89-4287-ad8f-98cef6afa353\") " pod="glance-kuttl-tests/glance-db-create-h6dk7" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.377670 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqgxl\" (UniqueName: \"kubernetes.io/projected/d62c944d-3e89-4287-ad8f-98cef6afa353-kube-api-access-pqgxl\") pod \"glance-db-create-h6dk7\" (UID: \"d62c944d-3e89-4287-ad8f-98cef6afa353\") " pod="glance-kuttl-tests/glance-db-create-h6dk7" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.377818 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdw66\" (UniqueName: \"kubernetes.io/projected/22e07121-401d-4a21-b04c-baf4f8d7e8f2-kube-api-access-vdw66\") pod \"glance-0d4d-account-create-update-q5hsc\" (UID: \"22e07121-401d-4a21-b04c-baf4f8d7e8f2\") " pod="glance-kuttl-tests/glance-0d4d-account-create-update-q5hsc" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.479156 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqgxl\" (UniqueName: \"kubernetes.io/projected/d62c944d-3e89-4287-ad8f-98cef6afa353-kube-api-access-pqgxl\") pod \"glance-db-create-h6dk7\" (UID: \"d62c944d-3e89-4287-ad8f-98cef6afa353\") " pod="glance-kuttl-tests/glance-db-create-h6dk7" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.479336 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdw66\" (UniqueName: \"kubernetes.io/projected/22e07121-401d-4a21-b04c-baf4f8d7e8f2-kube-api-access-vdw66\") pod \"glance-0d4d-account-create-update-q5hsc\" (UID: \"22e07121-401d-4a21-b04c-baf4f8d7e8f2\") " pod="glance-kuttl-tests/glance-0d4d-account-create-update-q5hsc" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.479593 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22e07121-401d-4a21-b04c-baf4f8d7e8f2-operator-scripts\") pod \"glance-0d4d-account-create-update-q5hsc\" (UID: \"22e07121-401d-4a21-b04c-baf4f8d7e8f2\") " pod="glance-kuttl-tests/glance-0d4d-account-create-update-q5hsc" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.479649 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d62c944d-3e89-4287-ad8f-98cef6afa353-operator-scripts\") pod \"glance-db-create-h6dk7\" (UID: \"d62c944d-3e89-4287-ad8f-98cef6afa353\") " pod="glance-kuttl-tests/glance-db-create-h6dk7" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.480873 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22e07121-401d-4a21-b04c-baf4f8d7e8f2-operator-scripts\") pod \"glance-0d4d-account-create-update-q5hsc\" (UID: \"22e07121-401d-4a21-b04c-baf4f8d7e8f2\") " pod="glance-kuttl-tests/glance-0d4d-account-create-update-q5hsc" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.481117 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d62c944d-3e89-4287-ad8f-98cef6afa353-operator-scripts\") pod \"glance-db-create-h6dk7\" (UID: \"d62c944d-3e89-4287-ad8f-98cef6afa353\") " pod="glance-kuttl-tests/glance-db-create-h6dk7" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.507656 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqgxl\" (UniqueName: \"kubernetes.io/projected/d62c944d-3e89-4287-ad8f-98cef6afa353-kube-api-access-pqgxl\") pod \"glance-db-create-h6dk7\" (UID: \"d62c944d-3e89-4287-ad8f-98cef6afa353\") " pod="glance-kuttl-tests/glance-db-create-h6dk7" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.507723 
5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdw66\" (UniqueName: \"kubernetes.io/projected/22e07121-401d-4a21-b04c-baf4f8d7e8f2-kube-api-access-vdw66\") pod \"glance-0d4d-account-create-update-q5hsc\" (UID: \"22e07121-401d-4a21-b04c-baf4f8d7e8f2\") " pod="glance-kuttl-tests/glance-0d4d-account-create-update-q5hsc" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.545207 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06ae4ab1-219f-4500-b1ea-0db090c7a3ac" path="/var/lib/kubelet/pods/06ae4ab1-219f-4500-b1ea-0db090c7a3ac/volumes" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.546401 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45c6765f-b967-410e-b07d-75cfb742e780" path="/var/lib/kubelet/pods/45c6765f-b967-410e-b07d-75cfb742e780/volumes" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.547518 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65e09b02-d156-4def-a546-5cfe04eda5e2" path="/var/lib/kubelet/pods/65e09b02-d156-4def-a546-5cfe04eda5e2/volumes" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.550074 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce8f090c-d792-47f3-b734-5c59d4762eb9" path="/var/lib/kubelet/pods/ce8f090c-d792-47f3-b734-5c59d4762eb9/volumes" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.661750 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-h6dk7" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.674585 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-0d4d-account-create-update-q5hsc" Nov 26 22:58:27 crc kubenswrapper[5008]: I1126 22:58:27.946356 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-h6dk7"] Nov 26 22:58:28 crc kubenswrapper[5008]: I1126 22:58:28.224194 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-0d4d-account-create-update-q5hsc"] Nov 26 22:58:28 crc kubenswrapper[5008]: W1126 22:58:28.236344 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22e07121_401d_4a21_b04c_baf4f8d7e8f2.slice/crio-6dde6a948f77ea6f3f3c8e1926a006981fac19426832489795eab7b4c2cdee86 WatchSource:0}: Error finding container 6dde6a948f77ea6f3f3c8e1926a006981fac19426832489795eab7b4c2cdee86: Status 404 returned error can't find the container with id 6dde6a948f77ea6f3f3c8e1926a006981fac19426832489795eab7b4c2cdee86 Nov 26 22:58:28 crc kubenswrapper[5008]: I1126 22:58:28.477208 5008 generic.go:334] "Generic (PLEG): container finished" podID="d62c944d-3e89-4287-ad8f-98cef6afa353" containerID="add3e7e0c6aff07f351114f48fefd750e6f566894de1ec122bc1fb0e7a05b9c3" exitCode=0 Nov 26 22:58:28 crc kubenswrapper[5008]: I1126 22:58:28.477353 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-h6dk7" event={"ID":"d62c944d-3e89-4287-ad8f-98cef6afa353","Type":"ContainerDied","Data":"add3e7e0c6aff07f351114f48fefd750e6f566894de1ec122bc1fb0e7a05b9c3"} Nov 26 22:58:28 crc kubenswrapper[5008]: I1126 22:58:28.477432 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-h6dk7" event={"ID":"d62c944d-3e89-4287-ad8f-98cef6afa353","Type":"ContainerStarted","Data":"bf3a2376fcc32cdb187e7d9914ce61a3a5ac3272b2a8558b6c8806d5efcbaf98"} Nov 26 22:58:28 crc kubenswrapper[5008]: I1126 22:58:28.479936 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-0d4d-account-create-update-q5hsc" event={"ID":"22e07121-401d-4a21-b04c-baf4f8d7e8f2","Type":"ContainerStarted","Data":"408f543f03320396668e32f4dfe777d6acc316caa4456ca012665249d74c51d9"} Nov 26 22:58:28 crc kubenswrapper[5008]: I1126 22:58:28.480279 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-0d4d-account-create-update-q5hsc" event={"ID":"22e07121-401d-4a21-b04c-baf4f8d7e8f2","Type":"ContainerStarted","Data":"6dde6a948f77ea6f3f3c8e1926a006981fac19426832489795eab7b4c2cdee86"} Nov 26 22:58:28 crc kubenswrapper[5008]: I1126 22:58:28.508986 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-0d4d-account-create-update-q5hsc" podStartSLOduration=1.5089385260000001 podStartE2EDuration="1.508938526s" podCreationTimestamp="2025-11-26 22:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:58:28.506286933 +0000 UTC m=+1183.918980975" watchObservedRunningTime="2025-11-26 22:58:28.508938526 +0000 UTC m=+1183.921632568" Nov 26 22:58:29 crc kubenswrapper[5008]: I1126 22:58:29.489077 5008 generic.go:334] "Generic (PLEG): container finished" podID="22e07121-401d-4a21-b04c-baf4f8d7e8f2" containerID="408f543f03320396668e32f4dfe777d6acc316caa4456ca012665249d74c51d9" exitCode=0 Nov 26 22:58:29 crc kubenswrapper[5008]: I1126 22:58:29.489156 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-0d4d-account-create-update-q5hsc" event={"ID":"22e07121-401d-4a21-b04c-baf4f8d7e8f2","Type":"ContainerDied","Data":"408f543f03320396668e32f4dfe777d6acc316caa4456ca012665249d74c51d9"} Nov 26 22:58:29 crc kubenswrapper[5008]: I1126 22:58:29.871278 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-h6dk7" Nov 26 22:58:29 crc kubenswrapper[5008]: I1126 22:58:29.935998 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d62c944d-3e89-4287-ad8f-98cef6afa353-operator-scripts\") pod \"d62c944d-3e89-4287-ad8f-98cef6afa353\" (UID: \"d62c944d-3e89-4287-ad8f-98cef6afa353\") " Nov 26 22:58:29 crc kubenswrapper[5008]: I1126 22:58:29.936350 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqgxl\" (UniqueName: \"kubernetes.io/projected/d62c944d-3e89-4287-ad8f-98cef6afa353-kube-api-access-pqgxl\") pod \"d62c944d-3e89-4287-ad8f-98cef6afa353\" (UID: \"d62c944d-3e89-4287-ad8f-98cef6afa353\") " Nov 26 22:58:29 crc kubenswrapper[5008]: I1126 22:58:29.937202 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d62c944d-3e89-4287-ad8f-98cef6afa353-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d62c944d-3e89-4287-ad8f-98cef6afa353" (UID: "d62c944d-3e89-4287-ad8f-98cef6afa353"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:58:29 crc kubenswrapper[5008]: I1126 22:58:29.942815 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d62c944d-3e89-4287-ad8f-98cef6afa353-kube-api-access-pqgxl" (OuterVolumeSpecName: "kube-api-access-pqgxl") pod "d62c944d-3e89-4287-ad8f-98cef6afa353" (UID: "d62c944d-3e89-4287-ad8f-98cef6afa353"). InnerVolumeSpecName "kube-api-access-pqgxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:58:30 crc kubenswrapper[5008]: I1126 22:58:30.038954 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d62c944d-3e89-4287-ad8f-98cef6afa353-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:30 crc kubenswrapper[5008]: I1126 22:58:30.039042 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqgxl\" (UniqueName: \"kubernetes.io/projected/d62c944d-3e89-4287-ad8f-98cef6afa353-kube-api-access-pqgxl\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:30 crc kubenswrapper[5008]: I1126 22:58:30.501761 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-h6dk7" event={"ID":"d62c944d-3e89-4287-ad8f-98cef6afa353","Type":"ContainerDied","Data":"bf3a2376fcc32cdb187e7d9914ce61a3a5ac3272b2a8558b6c8806d5efcbaf98"} Nov 26 22:58:30 crc kubenswrapper[5008]: I1126 22:58:30.501860 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf3a2376fcc32cdb187e7d9914ce61a3a5ac3272b2a8558b6c8806d5efcbaf98" Nov 26 22:58:30 crc kubenswrapper[5008]: I1126 22:58:30.502032 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-h6dk7" Nov 26 22:58:30 crc kubenswrapper[5008]: I1126 22:58:30.888132 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-0d4d-account-create-update-q5hsc" Nov 26 22:58:30 crc kubenswrapper[5008]: I1126 22:58:30.956507 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdw66\" (UniqueName: \"kubernetes.io/projected/22e07121-401d-4a21-b04c-baf4f8d7e8f2-kube-api-access-vdw66\") pod \"22e07121-401d-4a21-b04c-baf4f8d7e8f2\" (UID: \"22e07121-401d-4a21-b04c-baf4f8d7e8f2\") " Nov 26 22:58:30 crc kubenswrapper[5008]: I1126 22:58:30.956685 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22e07121-401d-4a21-b04c-baf4f8d7e8f2-operator-scripts\") pod \"22e07121-401d-4a21-b04c-baf4f8d7e8f2\" (UID: \"22e07121-401d-4a21-b04c-baf4f8d7e8f2\") " Nov 26 22:58:30 crc kubenswrapper[5008]: I1126 22:58:30.957440 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22e07121-401d-4a21-b04c-baf4f8d7e8f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "22e07121-401d-4a21-b04c-baf4f8d7e8f2" (UID: "22e07121-401d-4a21-b04c-baf4f8d7e8f2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:58:30 crc kubenswrapper[5008]: I1126 22:58:30.962106 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22e07121-401d-4a21-b04c-baf4f8d7e8f2-kube-api-access-vdw66" (OuterVolumeSpecName: "kube-api-access-vdw66") pod "22e07121-401d-4a21-b04c-baf4f8d7e8f2" (UID: "22e07121-401d-4a21-b04c-baf4f8d7e8f2"). InnerVolumeSpecName "kube-api-access-vdw66". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:58:31 crc kubenswrapper[5008]: I1126 22:58:31.057912 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdw66\" (UniqueName: \"kubernetes.io/projected/22e07121-401d-4a21-b04c-baf4f8d7e8f2-kube-api-access-vdw66\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:31 crc kubenswrapper[5008]: I1126 22:58:31.058047 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22e07121-401d-4a21-b04c-baf4f8d7e8f2-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:31 crc kubenswrapper[5008]: I1126 22:58:31.515115 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-0d4d-account-create-update-q5hsc" event={"ID":"22e07121-401d-4a21-b04c-baf4f8d7e8f2","Type":"ContainerDied","Data":"6dde6a948f77ea6f3f3c8e1926a006981fac19426832489795eab7b4c2cdee86"} Nov 26 22:58:31 crc kubenswrapper[5008]: I1126 22:58:31.515541 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dde6a948f77ea6f3f3c8e1926a006981fac19426832489795eab7b4c2cdee86" Nov 26 22:58:31 crc kubenswrapper[5008]: I1126 22:58:31.515202 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-0d4d-account-create-update-q5hsc" Nov 26 22:58:32 crc kubenswrapper[5008]: I1126 22:58:32.624468 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-ntc6s"] Nov 26 22:58:32 crc kubenswrapper[5008]: E1126 22:58:32.624908 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62c944d-3e89-4287-ad8f-98cef6afa353" containerName="mariadb-database-create" Nov 26 22:58:32 crc kubenswrapper[5008]: I1126 22:58:32.624932 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62c944d-3e89-4287-ad8f-98cef6afa353" containerName="mariadb-database-create" Nov 26 22:58:32 crc kubenswrapper[5008]: E1126 22:58:32.624953 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22e07121-401d-4a21-b04c-baf4f8d7e8f2" containerName="mariadb-account-create-update" Nov 26 22:58:32 crc kubenswrapper[5008]: I1126 22:58:32.624996 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="22e07121-401d-4a21-b04c-baf4f8d7e8f2" containerName="mariadb-account-create-update" Nov 26 22:58:32 crc kubenswrapper[5008]: I1126 22:58:32.625234 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62c944d-3e89-4287-ad8f-98cef6afa353" containerName="mariadb-database-create" Nov 26 22:58:32 crc kubenswrapper[5008]: I1126 22:58:32.625282 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="22e07121-401d-4a21-b04c-baf4f8d7e8f2" containerName="mariadb-account-create-update" Nov 26 22:58:32 crc kubenswrapper[5008]: I1126 22:58:32.626029 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-ntc6s" Nov 26 22:58:32 crc kubenswrapper[5008]: I1126 22:58:32.628817 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-hbj5g" Nov 26 22:58:32 crc kubenswrapper[5008]: I1126 22:58:32.635721 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Nov 26 22:58:32 crc kubenswrapper[5008]: I1126 22:58:32.641040 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-ntc6s"] Nov 26 22:58:32 crc kubenswrapper[5008]: I1126 22:58:32.690183 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttt8r\" (UniqueName: \"kubernetes.io/projected/6f02c876-b9c0-4f21-9e7c-678d0a032155-kube-api-access-ttt8r\") pod \"glance-db-sync-ntc6s\" (UID: \"6f02c876-b9c0-4f21-9e7c-678d0a032155\") " pod="glance-kuttl-tests/glance-db-sync-ntc6s" Nov 26 22:58:32 crc kubenswrapper[5008]: I1126 22:58:32.690251 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f02c876-b9c0-4f21-9e7c-678d0a032155-db-sync-config-data\") pod \"glance-db-sync-ntc6s\" (UID: \"6f02c876-b9c0-4f21-9e7c-678d0a032155\") " pod="glance-kuttl-tests/glance-db-sync-ntc6s" Nov 26 22:58:32 crc kubenswrapper[5008]: I1126 22:58:32.690346 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f02c876-b9c0-4f21-9e7c-678d0a032155-config-data\") pod \"glance-db-sync-ntc6s\" (UID: \"6f02c876-b9c0-4f21-9e7c-678d0a032155\") " pod="glance-kuttl-tests/glance-db-sync-ntc6s" Nov 26 22:58:32 crc kubenswrapper[5008]: I1126 22:58:32.791910 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttt8r\" (UniqueName: 
\"kubernetes.io/projected/6f02c876-b9c0-4f21-9e7c-678d0a032155-kube-api-access-ttt8r\") pod \"glance-db-sync-ntc6s\" (UID: \"6f02c876-b9c0-4f21-9e7c-678d0a032155\") " pod="glance-kuttl-tests/glance-db-sync-ntc6s" Nov 26 22:58:32 crc kubenswrapper[5008]: I1126 22:58:32.792023 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f02c876-b9c0-4f21-9e7c-678d0a032155-db-sync-config-data\") pod \"glance-db-sync-ntc6s\" (UID: \"6f02c876-b9c0-4f21-9e7c-678d0a032155\") " pod="glance-kuttl-tests/glance-db-sync-ntc6s" Nov 26 22:58:32 crc kubenswrapper[5008]: I1126 22:58:32.792113 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f02c876-b9c0-4f21-9e7c-678d0a032155-config-data\") pod \"glance-db-sync-ntc6s\" (UID: \"6f02c876-b9c0-4f21-9e7c-678d0a032155\") " pod="glance-kuttl-tests/glance-db-sync-ntc6s" Nov 26 22:58:32 crc kubenswrapper[5008]: I1126 22:58:32.798006 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f02c876-b9c0-4f21-9e7c-678d0a032155-db-sync-config-data\") pod \"glance-db-sync-ntc6s\" (UID: \"6f02c876-b9c0-4f21-9e7c-678d0a032155\") " pod="glance-kuttl-tests/glance-db-sync-ntc6s" Nov 26 22:58:32 crc kubenswrapper[5008]: I1126 22:58:32.798539 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f02c876-b9c0-4f21-9e7c-678d0a032155-config-data\") pod \"glance-db-sync-ntc6s\" (UID: \"6f02c876-b9c0-4f21-9e7c-678d0a032155\") " pod="glance-kuttl-tests/glance-db-sync-ntc6s" Nov 26 22:58:32 crc kubenswrapper[5008]: I1126 22:58:32.830717 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttt8r\" (UniqueName: \"kubernetes.io/projected/6f02c876-b9c0-4f21-9e7c-678d0a032155-kube-api-access-ttt8r\") pod 
\"glance-db-sync-ntc6s\" (UID: \"6f02c876-b9c0-4f21-9e7c-678d0a032155\") " pod="glance-kuttl-tests/glance-db-sync-ntc6s" Nov 26 22:58:32 crc kubenswrapper[5008]: I1126 22:58:32.948043 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-ntc6s" Nov 26 22:58:33 crc kubenswrapper[5008]: I1126 22:58:33.419348 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-ntc6s"] Nov 26 22:58:33 crc kubenswrapper[5008]: I1126 22:58:33.532621 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-ntc6s" event={"ID":"6f02c876-b9c0-4f21-9e7c-678d0a032155","Type":"ContainerStarted","Data":"9ba4e926b0d81c711de2905543e9631c32d3d5efbcb316aefcf847e40d7e1eb3"} Nov 26 22:58:34 crc kubenswrapper[5008]: I1126 22:58:34.544538 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-ntc6s" event={"ID":"6f02c876-b9c0-4f21-9e7c-678d0a032155","Type":"ContainerStarted","Data":"654b8ed6b96303cd37493288703c948f78a1ce2ee0b8ada6171e68ff6753e584"} Nov 26 22:58:34 crc kubenswrapper[5008]: I1126 22:58:34.567348 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-ntc6s" podStartSLOduration=2.567322974 podStartE2EDuration="2.567322974s" podCreationTimestamp="2025-11-26 22:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:58:34.55924411 +0000 UTC m=+1189.971938122" watchObservedRunningTime="2025-11-26 22:58:34.567322974 +0000 UTC m=+1189.980016996" Nov 26 22:58:37 crc kubenswrapper[5008]: I1126 22:58:37.590198 5008 generic.go:334] "Generic (PLEG): container finished" podID="6f02c876-b9c0-4f21-9e7c-678d0a032155" containerID="654b8ed6b96303cd37493288703c948f78a1ce2ee0b8ada6171e68ff6753e584" exitCode=0 Nov 26 22:58:37 crc kubenswrapper[5008]: I1126 22:58:37.591729 5008 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-ntc6s" event={"ID":"6f02c876-b9c0-4f21-9e7c-678d0a032155","Type":"ContainerDied","Data":"654b8ed6b96303cd37493288703c948f78a1ce2ee0b8ada6171e68ff6753e584"} Nov 26 22:58:39 crc kubenswrapper[5008]: I1126 22:58:39.003162 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-ntc6s" Nov 26 22:58:39 crc kubenswrapper[5008]: I1126 22:58:39.091319 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f02c876-b9c0-4f21-9e7c-678d0a032155-db-sync-config-data\") pod \"6f02c876-b9c0-4f21-9e7c-678d0a032155\" (UID: \"6f02c876-b9c0-4f21-9e7c-678d0a032155\") " Nov 26 22:58:39 crc kubenswrapper[5008]: I1126 22:58:39.091518 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttt8r\" (UniqueName: \"kubernetes.io/projected/6f02c876-b9c0-4f21-9e7c-678d0a032155-kube-api-access-ttt8r\") pod \"6f02c876-b9c0-4f21-9e7c-678d0a032155\" (UID: \"6f02c876-b9c0-4f21-9e7c-678d0a032155\") " Nov 26 22:58:39 crc kubenswrapper[5008]: I1126 22:58:39.091568 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f02c876-b9c0-4f21-9e7c-678d0a032155-config-data\") pod \"6f02c876-b9c0-4f21-9e7c-678d0a032155\" (UID: \"6f02c876-b9c0-4f21-9e7c-678d0a032155\") " Nov 26 22:58:39 crc kubenswrapper[5008]: I1126 22:58:39.099481 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f02c876-b9c0-4f21-9e7c-678d0a032155-kube-api-access-ttt8r" (OuterVolumeSpecName: "kube-api-access-ttt8r") pod "6f02c876-b9c0-4f21-9e7c-678d0a032155" (UID: "6f02c876-b9c0-4f21-9e7c-678d0a032155"). InnerVolumeSpecName "kube-api-access-ttt8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:58:39 crc kubenswrapper[5008]: I1126 22:58:39.099499 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f02c876-b9c0-4f21-9e7c-678d0a032155-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6f02c876-b9c0-4f21-9e7c-678d0a032155" (UID: "6f02c876-b9c0-4f21-9e7c-678d0a032155"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:58:39 crc kubenswrapper[5008]: I1126 22:58:39.142311 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f02c876-b9c0-4f21-9e7c-678d0a032155-config-data" (OuterVolumeSpecName: "config-data") pod "6f02c876-b9c0-4f21-9e7c-678d0a032155" (UID: "6f02c876-b9c0-4f21-9e7c-678d0a032155"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:58:39 crc kubenswrapper[5008]: I1126 22:58:39.192880 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttt8r\" (UniqueName: \"kubernetes.io/projected/6f02c876-b9c0-4f21-9e7c-678d0a032155-kube-api-access-ttt8r\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:39 crc kubenswrapper[5008]: I1126 22:58:39.192913 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f02c876-b9c0-4f21-9e7c-678d0a032155-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:39 crc kubenswrapper[5008]: I1126 22:58:39.192923 5008 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f02c876-b9c0-4f21-9e7c-678d0a032155-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:39 crc kubenswrapper[5008]: I1126 22:58:39.617284 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-ntc6s" 
event={"ID":"6f02c876-b9c0-4f21-9e7c-678d0a032155","Type":"ContainerDied","Data":"9ba4e926b0d81c711de2905543e9631c32d3d5efbcb316aefcf847e40d7e1eb3"} Nov 26 22:58:39 crc kubenswrapper[5008]: I1126 22:58:39.617603 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ba4e926b0d81c711de2905543e9631c32d3d5efbcb316aefcf847e40d7e1eb3" Nov 26 22:58:39 crc kubenswrapper[5008]: I1126 22:58:39.617670 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-ntc6s" Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.780742 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 22:58:40 crc kubenswrapper[5008]: E1126 22:58:40.781223 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f02c876-b9c0-4f21-9e7c-678d0a032155" containerName="glance-db-sync" Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.781250 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f02c876-b9c0-4f21-9e7c-678d0a032155" containerName="glance-db-sync" Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.781498 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f02c876-b9c0-4f21-9e7c-678d0a032155" containerName="glance-db-sync" Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.783281 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.784720 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.785888 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-hbj5g" Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.787029 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.797846 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.918429 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-run\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.918618 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.918687 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-sys\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:40 
crc kubenswrapper[5008]: I1126 22:58:40.918748 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.918767 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-scripts\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.918848 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-dev\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.918874 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlgjv\" (UniqueName: \"kubernetes.io/projected/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-kube-api-access-xlgjv\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.918903 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.918940 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-config-data\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.919168 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.919239 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.919273 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.919302 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-lib-modules\") pod \"glance-default-external-api-0\" (UID: 
\"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:40 crc kubenswrapper[5008]: I1126 22:58:40.919363 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-logs\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.020601 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-dev\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.020862 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlgjv\" (UniqueName: \"kubernetes.io/projected/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-kube-api-access-xlgjv\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.020960 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.020746 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-dev\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.021069 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-config-data\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.021351 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.021536 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.021582 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.021457 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 
22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.021670 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.021600 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.021713 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.021761 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.021777 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-logs\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.021830 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-run\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.021901 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.021955 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-sys\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.022002 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-sys\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.022011 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-logs\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.021958 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-run\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.022109 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.022159 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-scripts\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.022227 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.022282 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.022417 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.031130 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-scripts\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.044128 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-config-data\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.045729 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlgjv\" (UniqueName: \"kubernetes.io/projected/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-kube-api-access-xlgjv\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.054116 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.058739 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.061128 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.067531 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.085330 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.086371 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.111728 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.123591 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b6531e4-57d7-4174-81b3-56a2bc515001-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.123722 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.123805 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b6531e4-57d7-4174-81b3-56a2bc515001-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.123895 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.123982 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-sys\") pod \"glance-default-internal-api-0\" 
(UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.124076 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b6531e4-57d7-4174-81b3-56a2bc515001-logs\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.124180 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.124257 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-dev\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.124334 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn2ch\" (UniqueName: \"kubernetes.io/projected/0b6531e4-57d7-4174-81b3-56a2bc515001-kube-api-access-mn2ch\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.124413 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.124491 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-run\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.124566 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b6531e4-57d7-4174-81b3-56a2bc515001-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.124648 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.124740 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.225858 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.226217 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b6531e4-57d7-4174-81b3-56a2bc515001-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.226251 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.226273 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b6531e4-57d7-4174-81b3-56a2bc515001-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.226304 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.226324 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-sys\") 
pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.226345 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b6531e4-57d7-4174-81b3-56a2bc515001-logs\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.226397 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.226418 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-dev\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.226441 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn2ch\" (UniqueName: \"kubernetes.io/projected/0b6531e4-57d7-4174-81b3-56a2bc515001-kube-api-access-mn2ch\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.226463 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-lib-modules\") pod 
\"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.226488 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-run\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.226507 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b6531e4-57d7-4174-81b3-56a2bc515001-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.226535 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.227108 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b6531e4-57d7-4174-81b3-56a2bc515001-logs\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.227306 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.226153 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.227430 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.227470 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-dev\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.227625 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.227726 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b6531e4-57d7-4174-81b3-56a2bc515001-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 
26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.227778 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-sys\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.228020 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-run\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.228058 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.229237 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.233259 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b6531e4-57d7-4174-81b3-56a2bc515001-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.238694 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b6531e4-57d7-4174-81b3-56a2bc515001-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.247912 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn2ch\" (UniqueName: \"kubernetes.io/projected/0b6531e4-57d7-4174-81b3-56a2bc515001-kube-api-access-mn2ch\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.253470 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.253938 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.483939 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.619328 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.740000 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:58:41 crc kubenswrapper[5008]: W1126 22:58:41.750063 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b6531e4_57d7_4174_81b3_56a2bc515001.slice/crio-7ff2823d07ab442df1ab05108a499512649a225f84b923f29b611177bef54175 WatchSource:0}: Error finding container 7ff2823d07ab442df1ab05108a499512649a225f84b923f29b611177bef54175: Status 404 returned error can't find the container with id 7ff2823d07ab442df1ab05108a499512649a225f84b923f29b611177bef54175 Nov 26 22:58:41 crc kubenswrapper[5008]: I1126 22:58:41.873427 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:58:42 crc kubenswrapper[5008]: I1126 22:58:42.646858 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"0b6531e4-57d7-4174-81b3-56a2bc515001","Type":"ContainerStarted","Data":"2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390"} Nov 26 22:58:42 crc kubenswrapper[5008]: I1126 22:58:42.647341 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"0b6531e4-57d7-4174-81b3-56a2bc515001","Type":"ContainerStarted","Data":"30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b"} Nov 26 22:58:42 crc kubenswrapper[5008]: I1126 22:58:42.647365 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"0b6531e4-57d7-4174-81b3-56a2bc515001","Type":"ContainerStarted","Data":"00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead"} Nov 26 22:58:42 crc kubenswrapper[5008]: I1126 22:58:42.647384 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"0b6531e4-57d7-4174-81b3-56a2bc515001","Type":"ContainerStarted","Data":"7ff2823d07ab442df1ab05108a499512649a225f84b923f29b611177bef54175"} Nov 26 22:58:42 crc kubenswrapper[5008]: I1126 22:58:42.647322 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="0b6531e4-57d7-4174-81b3-56a2bc515001" containerName="glance-httpd" containerID="cri-o://30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b" gracePeriod=30 Nov 26 22:58:42 crc kubenswrapper[5008]: I1126 22:58:42.647232 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="0b6531e4-57d7-4174-81b3-56a2bc515001" containerName="glance-api" containerID="cri-o://2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390" gracePeriod=30 Nov 26 22:58:42 crc kubenswrapper[5008]: I1126 22:58:42.647163 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="0b6531e4-57d7-4174-81b3-56a2bc515001" containerName="glance-log" containerID="cri-o://00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead" gracePeriod=30 Nov 26 22:58:42 crc kubenswrapper[5008]: I1126 22:58:42.651432 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"3236fda8-50c3-4d65-9c07-3f9817cc3a9f","Type":"ContainerStarted","Data":"443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d"} Nov 26 22:58:42 crc kubenswrapper[5008]: I1126 22:58:42.651491 5008 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"3236fda8-50c3-4d65-9c07-3f9817cc3a9f","Type":"ContainerStarted","Data":"d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39"} Nov 26 22:58:42 crc kubenswrapper[5008]: I1126 22:58:42.651511 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"3236fda8-50c3-4d65-9c07-3f9817cc3a9f","Type":"ContainerStarted","Data":"d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577"} Nov 26 22:58:42 crc kubenswrapper[5008]: I1126 22:58:42.651530 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"3236fda8-50c3-4d65-9c07-3f9817cc3a9f","Type":"ContainerStarted","Data":"bc8d5514c192c9c0e364941828da412947efe2cb86ccc767040f729ab241a100"} Nov 26 22:58:42 crc kubenswrapper[5008]: I1126 22:58:42.700188 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.7001623439999998 podStartE2EDuration="2.700162344s" podCreationTimestamp="2025-11-26 22:58:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:58:42.693749162 +0000 UTC m=+1198.106443164" watchObservedRunningTime="2025-11-26 22:58:42.700162344 +0000 UTC m=+1198.112856376" Nov 26 22:58:42 crc kubenswrapper[5008]: I1126 22:58:42.738559 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.7385295689999998 podStartE2EDuration="2.738529569s" podCreationTimestamp="2025-11-26 22:58:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:58:42.731014043 +0000 UTC m=+1198.143708105" 
watchObservedRunningTime="2025-11-26 22:58:42.738529569 +0000 UTC m=+1198.151223611" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.151855 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.286228 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-run\") pod \"0b6531e4-57d7-4174-81b3-56a2bc515001\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.286585 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-run" (OuterVolumeSpecName: "run") pod "0b6531e4-57d7-4174-81b3-56a2bc515001" (UID: "0b6531e4-57d7-4174-81b3-56a2bc515001"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.286665 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b6531e4-57d7-4174-81b3-56a2bc515001-httpd-run\") pod \"0b6531e4-57d7-4174-81b3-56a2bc515001\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.286944 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b6531e4-57d7-4174-81b3-56a2bc515001-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0b6531e4-57d7-4174-81b3-56a2bc515001" (UID: "0b6531e4-57d7-4174-81b3-56a2bc515001"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.286776 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"0b6531e4-57d7-4174-81b3-56a2bc515001\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.287108 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"0b6531e4-57d7-4174-81b3-56a2bc515001\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.287133 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b6531e4-57d7-4174-81b3-56a2bc515001-config-data\") pod \"0b6531e4-57d7-4174-81b3-56a2bc515001\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.287520 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b6531e4-57d7-4174-81b3-56a2bc515001-scripts\") pod \"0b6531e4-57d7-4174-81b3-56a2bc515001\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.287555 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-etc-nvme\") pod \"0b6531e4-57d7-4174-81b3-56a2bc515001\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.287578 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-lib-modules\") pod 
\"0b6531e4-57d7-4174-81b3-56a2bc515001\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.287599 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn2ch\" (UniqueName: \"kubernetes.io/projected/0b6531e4-57d7-4174-81b3-56a2bc515001-kube-api-access-mn2ch\") pod \"0b6531e4-57d7-4174-81b3-56a2bc515001\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.287644 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-sys\") pod \"0b6531e4-57d7-4174-81b3-56a2bc515001\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.287666 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-dev\") pod \"0b6531e4-57d7-4174-81b3-56a2bc515001\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.287692 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "0b6531e4-57d7-4174-81b3-56a2bc515001" (UID: "0b6531e4-57d7-4174-81b3-56a2bc515001"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.287714 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-var-locks-brick\") pod \"0b6531e4-57d7-4174-81b3-56a2bc515001\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.287732 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-etc-iscsi\") pod \"0b6531e4-57d7-4174-81b3-56a2bc515001\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.287736 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "0b6531e4-57d7-4174-81b3-56a2bc515001" (UID: "0b6531e4-57d7-4174-81b3-56a2bc515001"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.287750 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b6531e4-57d7-4174-81b3-56a2bc515001-logs\") pod \"0b6531e4-57d7-4174-81b3-56a2bc515001\" (UID: \"0b6531e4-57d7-4174-81b3-56a2bc515001\") " Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.287765 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "0b6531e4-57d7-4174-81b3-56a2bc515001" (UID: "0b6531e4-57d7-4174-81b3-56a2bc515001"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.287792 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "0b6531e4-57d7-4174-81b3-56a2bc515001" (UID: "0b6531e4-57d7-4174-81b3-56a2bc515001"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.287771 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-sys" (OuterVolumeSpecName: "sys") pod "0b6531e4-57d7-4174-81b3-56a2bc515001" (UID: "0b6531e4-57d7-4174-81b3-56a2bc515001"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.287788 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-dev" (OuterVolumeSpecName: "dev") pod "0b6531e4-57d7-4174-81b3-56a2bc515001" (UID: "0b6531e4-57d7-4174-81b3-56a2bc515001"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.288043 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.288055 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.288064 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-sys\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.288072 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-dev\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.288080 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.288091 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.288100 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0b6531e4-57d7-4174-81b3-56a2bc515001-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.288109 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/0b6531e4-57d7-4174-81b3-56a2bc515001-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.288137 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b6531e4-57d7-4174-81b3-56a2bc515001-logs" (OuterVolumeSpecName: "logs") pod "0b6531e4-57d7-4174-81b3-56a2bc515001" (UID: "0b6531e4-57d7-4174-81b3-56a2bc515001"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.292635 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b6531e4-57d7-4174-81b3-56a2bc515001-scripts" (OuterVolumeSpecName: "scripts") pod "0b6531e4-57d7-4174-81b3-56a2bc515001" (UID: "0b6531e4-57d7-4174-81b3-56a2bc515001"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.293159 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "0b6531e4-57d7-4174-81b3-56a2bc515001" (UID: "0b6531e4-57d7-4174-81b3-56a2bc515001"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.293280 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b6531e4-57d7-4174-81b3-56a2bc515001-kube-api-access-mn2ch" (OuterVolumeSpecName: "kube-api-access-mn2ch") pod "0b6531e4-57d7-4174-81b3-56a2bc515001" (UID: "0b6531e4-57d7-4174-81b3-56a2bc515001"). InnerVolumeSpecName "kube-api-access-mn2ch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.293346 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "0b6531e4-57d7-4174-81b3-56a2bc515001" (UID: "0b6531e4-57d7-4174-81b3-56a2bc515001"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.373041 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b6531e4-57d7-4174-81b3-56a2bc515001-config-data" (OuterVolumeSpecName: "config-data") pod "0b6531e4-57d7-4174-81b3-56a2bc515001" (UID: "0b6531e4-57d7-4174-81b3-56a2bc515001"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.396156 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.396198 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.396214 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b6531e4-57d7-4174-81b3-56a2bc515001-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.396225 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b6531e4-57d7-4174-81b3-56a2bc515001-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.396238 5008 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn2ch\" (UniqueName: \"kubernetes.io/projected/0b6531e4-57d7-4174-81b3-56a2bc515001-kube-api-access-mn2ch\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.396251 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b6531e4-57d7-4174-81b3-56a2bc515001-logs\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.410765 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.415143 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.496782 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.496817 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.663923 5008 generic.go:334] "Generic (PLEG): container finished" podID="0b6531e4-57d7-4174-81b3-56a2bc515001" containerID="2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390" exitCode=143 Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.663996 5008 generic.go:334] "Generic (PLEG): container finished" podID="0b6531e4-57d7-4174-81b3-56a2bc515001" containerID="30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b" exitCode=143 Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.664013 5008 
generic.go:334] "Generic (PLEG): container finished" podID="0b6531e4-57d7-4174-81b3-56a2bc515001" containerID="00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead" exitCode=143 Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.664042 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.664069 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"0b6531e4-57d7-4174-81b3-56a2bc515001","Type":"ContainerDied","Data":"2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390"} Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.664177 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"0b6531e4-57d7-4174-81b3-56a2bc515001","Type":"ContainerDied","Data":"30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b"} Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.664206 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"0b6531e4-57d7-4174-81b3-56a2bc515001","Type":"ContainerDied","Data":"00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead"} Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.664225 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"0b6531e4-57d7-4174-81b3-56a2bc515001","Type":"ContainerDied","Data":"7ff2823d07ab442df1ab05108a499512649a225f84b923f29b611177bef54175"} Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.664260 5008 scope.go:117] "RemoveContainer" containerID="2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.701063 5008 scope.go:117] "RemoveContainer" 
containerID="30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.708586 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.713929 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.733533 5008 scope.go:117] "RemoveContainer" containerID="00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.757229 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:58:43 crc kubenswrapper[5008]: E1126 22:58:43.758682 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b6531e4-57d7-4174-81b3-56a2bc515001" containerName="glance-httpd" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.758706 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b6531e4-57d7-4174-81b3-56a2bc515001" containerName="glance-httpd" Nov 26 22:58:43 crc kubenswrapper[5008]: E1126 22:58:43.758725 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b6531e4-57d7-4174-81b3-56a2bc515001" containerName="glance-log" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.758733 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b6531e4-57d7-4174-81b3-56a2bc515001" containerName="glance-log" Nov 26 22:58:43 crc kubenswrapper[5008]: E1126 22:58:43.758760 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b6531e4-57d7-4174-81b3-56a2bc515001" containerName="glance-api" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.758770 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b6531e4-57d7-4174-81b3-56a2bc515001" containerName="glance-api" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 
22:58:43.758937 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b6531e4-57d7-4174-81b3-56a2bc515001" containerName="glance-log" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.758950 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b6531e4-57d7-4174-81b3-56a2bc515001" containerName="glance-httpd" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.759001 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b6531e4-57d7-4174-81b3-56a2bc515001" containerName="glance-api" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.760486 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.763533 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.786958 5008 scope.go:117] "RemoveContainer" containerID="2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.787435 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:58:43 crc kubenswrapper[5008]: E1126 22:58:43.791097 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390\": container with ID starting with 2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390 not found: ID does not exist" containerID="2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.791142 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390"} err="failed to get 
container status \"2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390\": rpc error: code = NotFound desc = could not find container \"2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390\": container with ID starting with 2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390 not found: ID does not exist" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.791171 5008 scope.go:117] "RemoveContainer" containerID="30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b" Nov 26 22:58:43 crc kubenswrapper[5008]: E1126 22:58:43.791788 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b\": container with ID starting with 30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b not found: ID does not exist" containerID="30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.791861 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b"} err="failed to get container status \"30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b\": rpc error: code = NotFound desc = could not find container \"30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b\": container with ID starting with 30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b not found: ID does not exist" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.791895 5008 scope.go:117] "RemoveContainer" containerID="00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead" Nov 26 22:58:43 crc kubenswrapper[5008]: E1126 22:58:43.799452 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead\": container with ID starting with 00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead not found: ID does not exist" containerID="00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.799580 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead"} err="failed to get container status \"00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead\": rpc error: code = NotFound desc = could not find container \"00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead\": container with ID starting with 00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead not found: ID does not exist" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.799698 5008 scope.go:117] "RemoveContainer" containerID="2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.800321 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390"} err="failed to get container status \"2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390\": rpc error: code = NotFound desc = could not find container \"2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390\": container with ID starting with 2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390 not found: ID does not exist" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.800467 5008 scope.go:117] "RemoveContainer" containerID="30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.801083 5008 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b"} err="failed to get container status \"30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b\": rpc error: code = NotFound desc = could not find container \"30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b\": container with ID starting with 30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b not found: ID does not exist" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.801258 5008 scope.go:117] "RemoveContainer" containerID="00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.801572 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead"} err="failed to get container status \"00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead\": rpc error: code = NotFound desc = could not find container \"00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead\": container with ID starting with 00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead not found: ID does not exist" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.801756 5008 scope.go:117] "RemoveContainer" containerID="2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.802133 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390"} err="failed to get container status \"2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390\": rpc error: code = NotFound desc = could not find container \"2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390\": container with ID starting with 2fdf60ebde986cb21ecaddfe8f0f25e0e9931297faa54ac671bdc4651c016390 not found: ID does not 
exist" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.802189 5008 scope.go:117] "RemoveContainer" containerID="30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.802533 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b"} err="failed to get container status \"30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b\": rpc error: code = NotFound desc = could not find container \"30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b\": container with ID starting with 30a95a3e02a33d3625dff3d48095fe0c2867e8758e6876e6fe9afba5be591e1b not found: ID does not exist" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.802570 5008 scope.go:117] "RemoveContainer" containerID="00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.802912 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead"} err="failed to get container status \"00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead\": rpc error: code = NotFound desc = could not find container \"00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead\": container with ID starting with 00ae5df9c7d456d86760fafa2c8042fcc89f341e45e15ba0f714068cd7658ead not found: ID does not exist" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.901822 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e94f5176-d76e-496c-b44e-7b167a690016-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 
22:58:43.901904 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e94f5176-d76e-496c-b44e-7b167a690016-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.901934 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.901981 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-run\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.902155 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.902299 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94f5176-d76e-496c-b44e-7b167a690016-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 
22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.902373 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-sys\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.902476 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.902533 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.902716 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.902808 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.902899 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-dev\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.903006 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g98w\" (UniqueName: \"kubernetes.io/projected/e94f5176-d76e-496c-b44e-7b167a690016-kube-api-access-8g98w\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:43 crc kubenswrapper[5008]: I1126 22:58:43.903111 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e94f5176-d76e-496c-b44e-7b167a690016-logs\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.004172 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-run\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.004840 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: 
\"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.004316 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-run\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.004934 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94f5176-d76e-496c-b44e-7b167a690016-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.005040 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-sys\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.005051 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.005111 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.005161 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.005248 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.005246 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-sys\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.005305 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.005371 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-dev\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc 
kubenswrapper[5008]: I1126 22:58:44.005428 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g98w\" (UniqueName: \"kubernetes.io/projected/e94f5176-d76e-496c-b44e-7b167a690016-kube-api-access-8g98w\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.005486 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.005529 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e94f5176-d76e-496c-b44e-7b167a690016-logs\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.005562 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-dev\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.005636 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e94f5176-d76e-496c-b44e-7b167a690016-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.005661 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.005821 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e94f5176-d76e-496c-b44e-7b167a690016-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.005873 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.005425 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.006371 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.007458 5008 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e94f5176-d76e-496c-b44e-7b167a690016-logs\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.007627 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.008619 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e94f5176-d76e-496c-b44e-7b167a690016-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.011432 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94f5176-d76e-496c-b44e-7b167a690016-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.018614 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e94f5176-d76e-496c-b44e-7b167a690016-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.044431 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g98w\" (UniqueName: 
\"kubernetes.io/projected/e94f5176-d76e-496c-b44e-7b167a690016-kube-api-access-8g98w\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.083653 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.087756 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.092780 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.538337 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:58:44 crc kubenswrapper[5008]: I1126 22:58:44.672016 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e94f5176-d76e-496c-b44e-7b167a690016","Type":"ContainerStarted","Data":"af6e260439f078604aef07fb4e58f26fd6be15fed2a2d02ce729b51f700c89b0"} Nov 26 22:58:45 crc kubenswrapper[5008]: I1126 22:58:45.537913 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b6531e4-57d7-4174-81b3-56a2bc515001" path="/var/lib/kubelet/pods/0b6531e4-57d7-4174-81b3-56a2bc515001/volumes" Nov 26 22:58:45 crc kubenswrapper[5008]: I1126 22:58:45.681393 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e94f5176-d76e-496c-b44e-7b167a690016","Type":"ContainerStarted","Data":"6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258"} Nov 26 22:58:45 crc kubenswrapper[5008]: I1126 22:58:45.681704 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e94f5176-d76e-496c-b44e-7b167a690016","Type":"ContainerStarted","Data":"8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138"} Nov 26 22:58:45 crc kubenswrapper[5008]: I1126 22:58:45.681783 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e94f5176-d76e-496c-b44e-7b167a690016","Type":"ContainerStarted","Data":"7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721"} Nov 26 22:58:45 crc kubenswrapper[5008]: I1126 22:58:45.722205 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" 
podStartSLOduration=2.722183052 podStartE2EDuration="2.722183052s" podCreationTimestamp="2025-11-26 22:58:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:58:45.719673073 +0000 UTC m=+1201.132367075" watchObservedRunningTime="2025-11-26 22:58:45.722183052 +0000 UTC m=+1201.134877064" Nov 26 22:58:51 crc kubenswrapper[5008]: I1126 22:58:51.112133 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:51 crc kubenswrapper[5008]: I1126 22:58:51.114732 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:51 crc kubenswrapper[5008]: I1126 22:58:51.114760 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:51 crc kubenswrapper[5008]: I1126 22:58:51.158799 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:51 crc kubenswrapper[5008]: I1126 22:58:51.162039 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:51 crc kubenswrapper[5008]: I1126 22:58:51.185434 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:51 crc kubenswrapper[5008]: I1126 22:58:51.739282 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:51 crc kubenswrapper[5008]: I1126 22:58:51.739610 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:51 crc kubenswrapper[5008]: I1126 22:58:51.739623 5008 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:51 crc kubenswrapper[5008]: I1126 22:58:51.753254 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:51 crc kubenswrapper[5008]: I1126 22:58:51.758703 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:51 crc kubenswrapper[5008]: I1126 22:58:51.763545 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:58:54 crc kubenswrapper[5008]: I1126 22:58:54.093598 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:54 crc kubenswrapper[5008]: I1126 22:58:54.094631 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:54 crc kubenswrapper[5008]: I1126 22:58:54.094693 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:54 crc kubenswrapper[5008]: I1126 22:58:54.135629 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:54 crc kubenswrapper[5008]: I1126 22:58:54.139960 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:54 crc kubenswrapper[5008]: I1126 22:58:54.160267 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:54 crc kubenswrapper[5008]: I1126 22:58:54.771080 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:54 crc kubenswrapper[5008]: I1126 22:58:54.771157 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:54 crc kubenswrapper[5008]: I1126 22:58:54.771185 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:54 crc kubenswrapper[5008]: I1126 22:58:54.792716 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:54 crc kubenswrapper[5008]: I1126 22:58:54.794363 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:54 crc kubenswrapper[5008]: I1126 22:58:54.797000 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.718845 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.722641 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.728439 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.730887 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.759776 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.771185 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.847002 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.847067 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1a5d6ee-2595-4413-95be-b4c4656e978d-logs\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.847106 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa146f2c-c806-443d-b49c-289beed985a6-config-data\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.847259 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1a5d6ee-2595-4413-95be-b4c4656e978d-config-data\") pod \"glance-default-external-api-1\" (UID: 
\"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.847314 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.847337 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-run\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.847362 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.847383 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.847407 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.847501 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqc9g\" (UniqueName: \"kubernetes.io/projected/a1a5d6ee-2595-4413-95be-b4c4656e978d-kube-api-access-fqc9g\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.847648 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.847729 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.847796 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1a5d6ee-2595-4413-95be-b4c4656e978d-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.847827 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.847863 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-dev\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.847902 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.847942 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-run\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.848000 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa146f2c-c806-443d-b49c-289beed985a6-logs\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.848033 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/fa146f2c-c806-443d-b49c-289beed985a6-scripts\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.848084 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.848117 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.848165 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.848200 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa146f2c-c806-443d-b49c-289beed985a6-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.848227 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-sys\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.848283 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdskm\" (UniqueName: \"kubernetes.io/projected/fa146f2c-c806-443d-b49c-289beed985a6-kube-api-access-sdskm\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.848322 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-dev\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.848359 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1a5d6ee-2595-4413-95be-b4c4656e978d-scripts\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.848616 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-sys\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.938709 5008 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["glance-kuttl-tests/glance-default-internal-api-1"] Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.940166 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.952996 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954251 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1a5d6ee-2595-4413-95be-b4c4656e978d-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954291 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954315 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-dev\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954339 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc 
kubenswrapper[5008]: I1126 22:58:56.954361 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-run\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954380 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa146f2c-c806-443d-b49c-289beed985a6-logs\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954401 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa146f2c-c806-443d-b49c-289beed985a6-scripts\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954418 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954439 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954473 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954492 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa146f2c-c806-443d-b49c-289beed985a6-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954506 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-sys\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954544 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdskm\" (UniqueName: \"kubernetes.io/projected/fa146f2c-c806-443d-b49c-289beed985a6-kube-api-access-sdskm\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954574 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-dev\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954596 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/a1a5d6ee-2595-4413-95be-b4c4656e978d-scripts\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954611 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-sys\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954638 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954654 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1a5d6ee-2595-4413-95be-b4c4656e978d-logs\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954673 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa146f2c-c806-443d-b49c-289beed985a6-config-data\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954711 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a1a5d6ee-2595-4413-95be-b4c4656e978d-config-data\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954731 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954745 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-run\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954760 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954776 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954795 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954813 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqc9g\" (UniqueName: \"kubernetes.io/projected/a1a5d6ee-2595-4413-95be-b4c4656e978d-kube-api-access-fqc9g\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954826 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.954857 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.955172 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.955232 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.961182 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-dev\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.962048 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1a5d6ee-2595-4413-95be-b4c4656e978d-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.962153 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.962200 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-dev\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.962243 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc 
kubenswrapper[5008]: I1126 22:58:56.962289 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-run\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.962756 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.962801 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa146f2c-c806-443d-b49c-289beed985a6-logs\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.965212 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.965835 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.966136 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.966216 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-sys\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.966516 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1a5d6ee-2595-4413-95be-b4c4656e978d-logs\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.966686 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa146f2c-c806-443d-b49c-289beed985a6-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.966711 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-sys\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.966765 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.966822 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.966871 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.966164 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.967165 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-run\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.967218 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.970478 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa146f2c-c806-443d-b49c-289beed985a6-scripts\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.974072 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.989380 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa146f2c-c806-443d-b49c-289beed985a6-config-data\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.995108 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1a5d6ee-2595-4413-95be-b4c4656e978d-scripts\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:56 crc kubenswrapper[5008]: I1126 22:58:56.999733 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.003126 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1a5d6ee-2595-4413-95be-b4c4656e978d-config-data\") pod \"glance-default-external-api-1\" (UID: 
\"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.003351 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqc9g\" (UniqueName: \"kubernetes.io/projected/a1a5d6ee-2595-4413-95be-b4c4656e978d-kube-api-access-fqc9g\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.003574 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.012719 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdskm\" (UniqueName: \"kubernetes.io/projected/fa146f2c-c806-443d-b49c-289beed985a6-kube-api-access-sdskm\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.014868 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-2\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.032557 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.039839 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-1\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.055573 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.055950 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhw9s\" (UniqueName: \"kubernetes.io/projected/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-kube-api-access-hhw9s\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.056194 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.056306 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.056406 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-config-data\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.056499 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd420810-75bc-480d-82c6-7d5df62216a4-config-data\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.056583 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.056677 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-run\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.056769 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.056887 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-sys\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.057040 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.057106 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.057125 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-dev\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.057241 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.057259 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.057309 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-logs\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.057336 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-run\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.057360 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-dev\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.057374 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 
22:58:57.057395 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.057417 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd420810-75bc-480d-82c6-7d5df62216a4-logs\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.057437 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlztd\" (UniqueName: \"kubernetes.io/projected/bd420810-75bc-480d-82c6-7d5df62216a4-kube-api-access-mlztd\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.057459 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd420810-75bc-480d-82c6-7d5df62216a4-scripts\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.057473 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-scripts\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.057497 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.057521 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.057539 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-sys\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.057556 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.057604 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd420810-75bc-480d-82c6-7d5df62216a4-httpd-run\") pod \"glance-default-internal-api-2\" (UID: 
\"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.076350 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167295 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167333 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167393 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-logs\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167430 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-run\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167444 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" 
(UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167499 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-dev\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167457 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-dev\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167550 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167562 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167583 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167593 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-run\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167637 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167639 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167683 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd420810-75bc-480d-82c6-7d5df62216a4-logs\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167714 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlztd\" (UniqueName: 
\"kubernetes.io/projected/bd420810-75bc-480d-82c6-7d5df62216a4-kube-api-access-mlztd\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167748 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd420810-75bc-480d-82c6-7d5df62216a4-scripts\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167770 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-scripts\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167802 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167814 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-logs\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167836 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-etc-iscsi\") pod 
\"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167868 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-sys\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167895 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.167984 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd420810-75bc-480d-82c6-7d5df62216a4-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.168063 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhw9s\" (UniqueName: \"kubernetes.io/projected/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-kube-api-access-hhw9s\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.168117 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-2\" (UID: 
\"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.168163 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.168187 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-config-data\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.168211 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd420810-75bc-480d-82c6-7d5df62216a4-config-data\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.168233 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.168257 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-run\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.168281 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.168317 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-sys\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.168341 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.168363 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd420810-75bc-480d-82c6-7d5df62216a4-logs\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.168376 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc 
kubenswrapper[5008]: I1126 22:58:57.168451 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-dev\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.168604 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.168646 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.168704 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") device mount path \"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.168734 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-run\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc 
kubenswrapper[5008]: I1126 22:58:57.169259 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.169301 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-sys\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.170750 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-sys\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.170792 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.170800 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.170857 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded 
for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.170889 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-dev\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.171386 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd420810-75bc-480d-82c6-7d5df62216a4-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.171431 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.171482 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.173564 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bd420810-75bc-480d-82c6-7d5df62216a4-scripts\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.179858 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-scripts\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.185981 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-config-data\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.191927 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd420810-75bc-480d-82c6-7d5df62216a4-config-data\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.193838 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhw9s\" (UniqueName: \"kubernetes.io/projected/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-kube-api-access-hhw9s\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.200232 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlztd\" (UniqueName: 
\"kubernetes.io/projected/bd420810-75bc-480d-82c6-7d5df62216a4-kube-api-access-mlztd\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.201238 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.203528 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-1\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.219189 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.226655 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-2\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.268337 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.433417 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.502771 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 26 22:58:57 crc kubenswrapper[5008]: W1126 22:58:57.506693 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1a5d6ee_2595_4413_95be_b4c4656e978d.slice/crio-0e1e245a920e21a774b2c096f086588e7399263a3a04b56d9b3290157b6de938 WatchSource:0}: Error finding container 0e1e245a920e21a774b2c096f086588e7399263a3a04b56d9b3290157b6de938: Status 404 returned error can't find the container with id 0e1e245a920e21a774b2c096f086588e7399263a3a04b56d9b3290157b6de938 Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.620262 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.671049 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Nov 26 22:58:57 crc kubenswrapper[5008]: W1126 22:58:57.683205 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21ce0079_9cf2_4a67_b731_fc1b02efaf3c.slice/crio-fcd08b7cf8da2474f2cc0894ff869655491991e2608910217e7a55d06ebc3c83 WatchSource:0}: Error finding container fcd08b7cf8da2474f2cc0894ff869655491991e2608910217e7a55d06ebc3c83: Status 404 returned error can't find the container with id fcd08b7cf8da2474f2cc0894ff869655491991e2608910217e7a55d06ebc3c83 Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.807740 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"fa146f2c-c806-443d-b49c-289beed985a6","Type":"ContainerStarted","Data":"8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88"} Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.807779 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"fa146f2c-c806-443d-b49c-289beed985a6","Type":"ContainerStarted","Data":"dea1453ec8d129b53e64ebcde4e0eacd3fa9c1f004635d6038ba4c979f1b76cd"} Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.809282 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"21ce0079-9cf2-4a67-b731-fc1b02efaf3c","Type":"ContainerStarted","Data":"fcd08b7cf8da2474f2cc0894ff869655491991e2608910217e7a55d06ebc3c83"} Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.813156 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"a1a5d6ee-2595-4413-95be-b4c4656e978d","Type":"ContainerStarted","Data":"12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf"} Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.813176 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"a1a5d6ee-2595-4413-95be-b4c4656e978d","Type":"ContainerStarted","Data":"0e1e245a920e21a774b2c096f086588e7399263a3a04b56d9b3290157b6de938"} Nov 26 22:58:57 crc kubenswrapper[5008]: I1126 22:58:57.905549 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Nov 26 22:58:58 crc kubenswrapper[5008]: I1126 22:58:58.825779 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" 
event={"ID":"fa146f2c-c806-443d-b49c-289beed985a6","Type":"ContainerStarted","Data":"f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925"} Nov 26 22:58:58 crc kubenswrapper[5008]: I1126 22:58:58.826521 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"fa146f2c-c806-443d-b49c-289beed985a6","Type":"ContainerStarted","Data":"363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4"} Nov 26 22:58:58 crc kubenswrapper[5008]: I1126 22:58:58.828866 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"bd420810-75bc-480d-82c6-7d5df62216a4","Type":"ContainerStarted","Data":"028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95"} Nov 26 22:58:58 crc kubenswrapper[5008]: I1126 22:58:58.828922 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"bd420810-75bc-480d-82c6-7d5df62216a4","Type":"ContainerStarted","Data":"e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616"} Nov 26 22:58:58 crc kubenswrapper[5008]: I1126 22:58:58.828988 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"bd420810-75bc-480d-82c6-7d5df62216a4","Type":"ContainerStarted","Data":"a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58"} Nov 26 22:58:58 crc kubenswrapper[5008]: I1126 22:58:58.829008 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"bd420810-75bc-480d-82c6-7d5df62216a4","Type":"ContainerStarted","Data":"cec8e48a082fc996934f12eae717b1d62ae2ef7d5c511e2b6e2e86ab62ab9c25"} Nov 26 22:58:58 crc kubenswrapper[5008]: I1126 22:58:58.832254 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" 
event={"ID":"21ce0079-9cf2-4a67-b731-fc1b02efaf3c","Type":"ContainerStarted","Data":"d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb"} Nov 26 22:58:58 crc kubenswrapper[5008]: I1126 22:58:58.832290 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"21ce0079-9cf2-4a67-b731-fc1b02efaf3c","Type":"ContainerStarted","Data":"762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc"} Nov 26 22:58:58 crc kubenswrapper[5008]: I1126 22:58:58.832299 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"21ce0079-9cf2-4a67-b731-fc1b02efaf3c","Type":"ContainerStarted","Data":"9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c"} Nov 26 22:58:58 crc kubenswrapper[5008]: I1126 22:58:58.834823 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"a1a5d6ee-2595-4413-95be-b4c4656e978d","Type":"ContainerStarted","Data":"e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2"} Nov 26 22:58:58 crc kubenswrapper[5008]: I1126 22:58:58.834873 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"a1a5d6ee-2595-4413-95be-b4c4656e978d","Type":"ContainerStarted","Data":"9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4"} Nov 26 22:58:58 crc kubenswrapper[5008]: I1126 22:58:58.870378 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-2" podStartSLOduration=3.870359555 podStartE2EDuration="3.870359555s" podCreationTimestamp="2025-11-26 22:58:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:58:58.869877329 +0000 UTC m=+1214.282571341" watchObservedRunningTime="2025-11-26 
22:58:58.870359555 +0000 UTC m=+1214.283053547" Nov 26 22:58:58 crc kubenswrapper[5008]: I1126 22:58:58.912153 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-2" podStartSLOduration=3.9121292370000003 podStartE2EDuration="3.912129237s" podCreationTimestamp="2025-11-26 22:58:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:58:58.908070909 +0000 UTC m=+1214.320764921" watchObservedRunningTime="2025-11-26 22:58:58.912129237 +0000 UTC m=+1214.324823259" Nov 26 22:58:58 crc kubenswrapper[5008]: I1126 22:58:58.964255 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=3.964236473 podStartE2EDuration="3.964236473s" podCreationTimestamp="2025-11-26 22:58:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:58:58.96320016 +0000 UTC m=+1214.375894162" watchObservedRunningTime="2025-11-26 22:58:58.964236473 +0000 UTC m=+1214.376930475" Nov 26 22:58:58 crc kubenswrapper[5008]: I1126 22:58:58.989725 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=3.989705453 podStartE2EDuration="3.989705453s" podCreationTimestamp="2025-11-26 22:58:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:58:58.988284019 +0000 UTC m=+1214.400978041" watchObservedRunningTime="2025-11-26 22:58:58.989705453 +0000 UTC m=+1214.402399465" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.055934 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 
22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.056523 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.056537 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.077485 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.077807 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.077820 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.095010 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.103551 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.106141 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.106735 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.110178 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.251707 5008 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.268702 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.268742 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.269547 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.295508 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.302521 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.306267 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.434204 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.434344 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.434400 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.465487 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.468437 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.512409 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.925557 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.925944 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.926075 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.926103 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.926126 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.926153 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.926179 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.926204 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 
22:59:07.926229 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.926256 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.926276 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.926295 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.942586 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.944642 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.948683 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.955384 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.956556 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.958736 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.965364 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.965462 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.966491 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.971219 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.971332 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:07 crc kubenswrapper[5008]: I1126 22:59:07.977739 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:08 crc kubenswrapper[5008]: I1126 22:59:08.512128 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Nov 26 22:59:08 crc kubenswrapper[5008]: I1126 22:59:08.530019 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Nov 26 22:59:08 crc kubenswrapper[5008]: I1126 22:59:08.655897 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Nov 26 22:59:08 crc kubenswrapper[5008]: I1126 22:59:08.661277 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 26 22:59:09 crc kubenswrapper[5008]: I1126 22:59:09.942543 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="21ce0079-9cf2-4a67-b731-fc1b02efaf3c" containerName="glance-log" 
containerID="cri-o://9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c" gracePeriod=30 Nov 26 22:59:09 crc kubenswrapper[5008]: I1126 22:59:09.942622 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="21ce0079-9cf2-4a67-b731-fc1b02efaf3c" containerName="glance-api" containerID="cri-o://d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb" gracePeriod=30 Nov 26 22:59:09 crc kubenswrapper[5008]: I1126 22:59:09.942630 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="21ce0079-9cf2-4a67-b731-fc1b02efaf3c" containerName="glance-httpd" containerID="cri-o://762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc" gracePeriod=30 Nov 26 22:59:09 crc kubenswrapper[5008]: I1126 22:59:09.942890 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="a1a5d6ee-2595-4413-95be-b4c4656e978d" containerName="glance-log" containerID="cri-o://12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf" gracePeriod=30 Nov 26 22:59:09 crc kubenswrapper[5008]: I1126 22:59:09.943041 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="a1a5d6ee-2595-4413-95be-b4c4656e978d" containerName="glance-api" containerID="cri-o://e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2" gracePeriod=30 Nov 26 22:59:09 crc kubenswrapper[5008]: I1126 22:59:09.943085 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="a1a5d6ee-2595-4413-95be-b4c4656e978d" containerName="glance-httpd" containerID="cri-o://9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4" gracePeriod=30 Nov 26 22:59:09 crc kubenswrapper[5008]: I1126 
22:59:09.943119 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="bd420810-75bc-480d-82c6-7d5df62216a4" containerName="glance-log" containerID="cri-o://a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58" gracePeriod=30 Nov 26 22:59:09 crc kubenswrapper[5008]: I1126 22:59:09.943222 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="bd420810-75bc-480d-82c6-7d5df62216a4" containerName="glance-api" containerID="cri-o://028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95" gracePeriod=30 Nov 26 22:59:09 crc kubenswrapper[5008]: I1126 22:59:09.943254 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="bd420810-75bc-480d-82c6-7d5df62216a4" containerName="glance-httpd" containerID="cri-o://e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616" gracePeriod=30 Nov 26 22:59:09 crc kubenswrapper[5008]: I1126 22:59:09.943495 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="fa146f2c-c806-443d-b49c-289beed985a6" containerName="glance-log" containerID="cri-o://8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88" gracePeriod=30 Nov 26 22:59:09 crc kubenswrapper[5008]: I1126 22:59:09.943736 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="fa146f2c-c806-443d-b49c-289beed985a6" containerName="glance-api" containerID="cri-o://f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925" gracePeriod=30 Nov 26 22:59:09 crc kubenswrapper[5008]: I1126 22:59:09.943814 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" 
podUID="fa146f2c-c806-443d-b49c-289beed985a6" containerName="glance-httpd" containerID="cri-o://363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4" gracePeriod=30 Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.680632 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.802827 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdskm\" (UniqueName: \"kubernetes.io/projected/fa146f2c-c806-443d-b49c-289beed985a6-kube-api-access-sdskm\") pod \"fa146f2c-c806-443d-b49c-289beed985a6\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.802871 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-run\") pod \"fa146f2c-c806-443d-b49c-289beed985a6\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.802896 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"fa146f2c-c806-443d-b49c-289beed985a6\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.802988 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-var-locks-brick\") pod \"fa146f2c-c806-443d-b49c-289beed985a6\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.803026 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa146f2c-c806-443d-b49c-289beed985a6-logs\") pod 
\"fa146f2c-c806-443d-b49c-289beed985a6\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.803047 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-etc-iscsi\") pod \"fa146f2c-c806-443d-b49c-289beed985a6\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.803073 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-sys\") pod \"fa146f2c-c806-443d-b49c-289beed985a6\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.803160 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"fa146f2c-c806-443d-b49c-289beed985a6\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.803184 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-etc-nvme\") pod \"fa146f2c-c806-443d-b49c-289beed985a6\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.803229 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa146f2c-c806-443d-b49c-289beed985a6-config-data\") pod \"fa146f2c-c806-443d-b49c-289beed985a6\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.803401 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/fa146f2c-c806-443d-b49c-289beed985a6-httpd-run\") pod \"fa146f2c-c806-443d-b49c-289beed985a6\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.803442 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-dev\") pod \"fa146f2c-c806-443d-b49c-289beed985a6\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.803470 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa146f2c-c806-443d-b49c-289beed985a6-scripts\") pod \"fa146f2c-c806-443d-b49c-289beed985a6\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.803505 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-lib-modules\") pod \"fa146f2c-c806-443d-b49c-289beed985a6\" (UID: \"fa146f2c-c806-443d-b49c-289beed985a6\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.803902 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "fa146f2c-c806-443d-b49c-289beed985a6" (UID: "fa146f2c-c806-443d-b49c-289beed985a6"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.804508 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "fa146f2c-c806-443d-b49c-289beed985a6" (UID: "fa146f2c-c806-443d-b49c-289beed985a6"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.804590 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-dev" (OuterVolumeSpecName: "dev") pod "fa146f2c-c806-443d-b49c-289beed985a6" (UID: "fa146f2c-c806-443d-b49c-289beed985a6"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.804613 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-run" (OuterVolumeSpecName: "run") pod "fa146f2c-c806-443d-b49c-289beed985a6" (UID: "fa146f2c-c806-443d-b49c-289beed985a6"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.804760 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa146f2c-c806-443d-b49c-289beed985a6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fa146f2c-c806-443d-b49c-289beed985a6" (UID: "fa146f2c-c806-443d-b49c-289beed985a6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.804883 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-sys" (OuterVolumeSpecName: "sys") pod "fa146f2c-c806-443d-b49c-289beed985a6" (UID: "fa146f2c-c806-443d-b49c-289beed985a6"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.804942 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "fa146f2c-c806-443d-b49c-289beed985a6" (UID: "fa146f2c-c806-443d-b49c-289beed985a6"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.805038 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa146f2c-c806-443d-b49c-289beed985a6-logs" (OuterVolumeSpecName: "logs") pod "fa146f2c-c806-443d-b49c-289beed985a6" (UID: "fa146f2c-c806-443d-b49c-289beed985a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.805080 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "fa146f2c-c806-443d-b49c-289beed985a6" (UID: "fa146f2c-c806-443d-b49c-289beed985a6"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.809260 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "fa146f2c-c806-443d-b49c-289beed985a6" (UID: "fa146f2c-c806-443d-b49c-289beed985a6"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.809931 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa146f2c-c806-443d-b49c-289beed985a6-scripts" (OuterVolumeSpecName: "scripts") pod "fa146f2c-c806-443d-b49c-289beed985a6" (UID: "fa146f2c-c806-443d-b49c-289beed985a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.810061 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa146f2c-c806-443d-b49c-289beed985a6-kube-api-access-sdskm" (OuterVolumeSpecName: "kube-api-access-sdskm") pod "fa146f2c-c806-443d-b49c-289beed985a6" (UID: "fa146f2c-c806-443d-b49c-289beed985a6"). InnerVolumeSpecName "kube-api-access-sdskm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.810385 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance-cache") pod "fa146f2c-c806-443d-b49c-289beed985a6" (UID: "fa146f2c-c806-443d-b49c-289beed985a6"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.834456 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.904668 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1a5d6ee-2595-4413-95be-b4c4656e978d-httpd-run\") pod \"a1a5d6ee-2595-4413-95be-b4c4656e978d\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.904716 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-run\") pod \"a1a5d6ee-2595-4413-95be-b4c4656e978d\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.904745 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-var-locks-brick\") pod \"a1a5d6ee-2595-4413-95be-b4c4656e978d\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.904758 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-sys\") pod \"a1a5d6ee-2595-4413-95be-b4c4656e978d\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.904789 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-etc-iscsi\") pod \"a1a5d6ee-2595-4413-95be-b4c4656e978d\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.904811 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqc9g\" (UniqueName: 
\"kubernetes.io/projected/a1a5d6ee-2595-4413-95be-b4c4656e978d-kube-api-access-fqc9g\") pod \"a1a5d6ee-2595-4413-95be-b4c4656e978d\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.904883 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1a5d6ee-2595-4413-95be-b4c4656e978d-scripts\") pod \"a1a5d6ee-2595-4413-95be-b4c4656e978d\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.904896 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-etc-nvme\") pod \"a1a5d6ee-2595-4413-95be-b4c4656e978d\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.904918 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1a5d6ee-2595-4413-95be-b4c4656e978d-logs\") pod \"a1a5d6ee-2595-4413-95be-b4c4656e978d\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.904933 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1a5d6ee-2595-4413-95be-b4c4656e978d-config-data\") pod \"a1a5d6ee-2595-4413-95be-b4c4656e978d\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.904950 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"a1a5d6ee-2595-4413-95be-b4c4656e978d\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.904996 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-lib-modules\") pod \"a1a5d6ee-2595-4413-95be-b4c4656e978d\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.905026 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-dev\") pod \"a1a5d6ee-2595-4413-95be-b4c4656e978d\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.905044 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"a1a5d6ee-2595-4413-95be-b4c4656e978d\" (UID: \"a1a5d6ee-2595-4413-95be-b4c4656e978d\") " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.905342 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.905354 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.905362 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa146f2c-c806-443d-b49c-289beed985a6-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.905370 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-dev\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.905378 5008 reconciler_common.go:293] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/fa146f2c-c806-443d-b49c-289beed985a6-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.905385 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.905394 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdskm\" (UniqueName: \"kubernetes.io/projected/fa146f2c-c806-443d-b49c-289beed985a6-kube-api-access-sdskm\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.905403 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.905417 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.905426 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.905436 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.905443 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa146f2c-c806-443d-b49c-289beed985a6-logs\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 
22:59:10.905466 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fa146f2c-c806-443d-b49c-289beed985a6-sys\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.905874 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1a5d6ee-2595-4413-95be-b4c4656e978d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a1a5d6ee-2595-4413-95be-b4c4656e978d" (UID: "a1a5d6ee-2595-4413-95be-b4c4656e978d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.905918 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-run" (OuterVolumeSpecName: "run") pod "a1a5d6ee-2595-4413-95be-b4c4656e978d" (UID: "a1a5d6ee-2595-4413-95be-b4c4656e978d"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.905943 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "a1a5d6ee-2595-4413-95be-b4c4656e978d" (UID: "a1a5d6ee-2595-4413-95be-b4c4656e978d"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.905985 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-sys" (OuterVolumeSpecName: "sys") pod "a1a5d6ee-2595-4413-95be-b4c4656e978d" (UID: "a1a5d6ee-2595-4413-95be-b4c4656e978d"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.906013 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "a1a5d6ee-2595-4413-95be-b4c4656e978d" (UID: "a1a5d6ee-2595-4413-95be-b4c4656e978d"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.907445 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-dev" (OuterVolumeSpecName: "dev") pod "a1a5d6ee-2595-4413-95be-b4c4656e978d" (UID: "a1a5d6ee-2595-4413-95be-b4c4656e978d"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.907502 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "a1a5d6ee-2595-4413-95be-b4c4656e978d" (UID: "a1a5d6ee-2595-4413-95be-b4c4656e978d"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.907799 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1a5d6ee-2595-4413-95be-b4c4656e978d-logs" (OuterVolumeSpecName: "logs") pod "a1a5d6ee-2595-4413-95be-b4c4656e978d" (UID: "a1a5d6ee-2595-4413-95be-b4c4656e978d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.907828 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "a1a5d6ee-2595-4413-95be-b4c4656e978d" (UID: "a1a5d6ee-2595-4413-95be-b4c4656e978d"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.909086 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1a5d6ee-2595-4413-95be-b4c4656e978d-kube-api-access-fqc9g" (OuterVolumeSpecName: "kube-api-access-fqc9g") pod "a1a5d6ee-2595-4413-95be-b4c4656e978d" (UID: "a1a5d6ee-2595-4413-95be-b4c4656e978d"). InnerVolumeSpecName "kube-api-access-fqc9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.910006 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "glance-cache") pod "a1a5d6ee-2595-4413-95be-b4c4656e978d" (UID: "a1a5d6ee-2595-4413-95be-b4c4656e978d"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.915578 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "a1a5d6ee-2595-4413-95be-b4c4656e978d" (UID: "a1a5d6ee-2595-4413-95be-b4c4656e978d"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.916331 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a5d6ee-2595-4413-95be-b4c4656e978d-scripts" (OuterVolumeSpecName: "scripts") pod "a1a5d6ee-2595-4413-95be-b4c4656e978d" (UID: "a1a5d6ee-2595-4413-95be-b4c4656e978d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.919125 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa146f2c-c806-443d-b49c-289beed985a6-config-data" (OuterVolumeSpecName: "config-data") pod "fa146f2c-c806-443d-b49c-289beed985a6" (UID: "fa146f2c-c806-443d-b49c-289beed985a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.921499 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.925044 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.928284 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.931146 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.960637 5008 generic.go:334] "Generic (PLEG): container finished" podID="fa146f2c-c806-443d-b49c-289beed985a6" containerID="f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925" exitCode=0 Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.960669 5008 generic.go:334] "Generic (PLEG): container finished" podID="fa146f2c-c806-443d-b49c-289beed985a6" containerID="363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4" exitCode=0 Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.960676 5008 generic.go:334] "Generic (PLEG): container finished" podID="fa146f2c-c806-443d-b49c-289beed985a6" containerID="8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88" exitCode=143 Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.960742 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"fa146f2c-c806-443d-b49c-289beed985a6","Type":"ContainerDied","Data":"f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925"} Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.960913 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"fa146f2c-c806-443d-b49c-289beed985a6","Type":"ContainerDied","Data":"363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4"} Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.960927 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"fa146f2c-c806-443d-b49c-289beed985a6","Type":"ContainerDied","Data":"8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88"} Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.963085 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.963593 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"fa146f2c-c806-443d-b49c-289beed985a6","Type":"ContainerDied","Data":"dea1453ec8d129b53e64ebcde4e0eacd3fa9c1f004635d6038ba4c979f1b76cd"} Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.963625 5008 scope.go:117] "RemoveContainer" containerID="f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.973315 5008 generic.go:334] "Generic (PLEG): container finished" podID="bd420810-75bc-480d-82c6-7d5df62216a4" containerID="028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95" exitCode=0 Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.973391 5008 generic.go:334] "Generic (PLEG): container finished" podID="bd420810-75bc-480d-82c6-7d5df62216a4" containerID="e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616" exitCode=0 Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.973401 5008 generic.go:334] "Generic (PLEG): container finished" podID="bd420810-75bc-480d-82c6-7d5df62216a4" containerID="a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58" exitCode=143 Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.973540 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"bd420810-75bc-480d-82c6-7d5df62216a4","Type":"ContainerDied","Data":"028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95"} Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.973580 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"bd420810-75bc-480d-82c6-7d5df62216a4","Type":"ContainerDied","Data":"e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616"} Nov 
26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.973591 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"bd420810-75bc-480d-82c6-7d5df62216a4","Type":"ContainerDied","Data":"a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58"} Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.973601 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"bd420810-75bc-480d-82c6-7d5df62216a4","Type":"ContainerDied","Data":"cec8e48a082fc996934f12eae717b1d62ae2ef7d5c511e2b6e2e86ab62ab9c25"} Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.973685 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.988799 5008 generic.go:334] "Generic (PLEG): container finished" podID="21ce0079-9cf2-4a67-b731-fc1b02efaf3c" containerID="d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb" exitCode=0 Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.988828 5008 generic.go:334] "Generic (PLEG): container finished" podID="21ce0079-9cf2-4a67-b731-fc1b02efaf3c" containerID="762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc" exitCode=0 Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.988837 5008 generic.go:334] "Generic (PLEG): container finished" podID="21ce0079-9cf2-4a67-b731-fc1b02efaf3c" containerID="9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c" exitCode=143 Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.988876 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"21ce0079-9cf2-4a67-b731-fc1b02efaf3c","Type":"ContainerDied","Data":"d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb"} Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.988904 5008 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"21ce0079-9cf2-4a67-b731-fc1b02efaf3c","Type":"ContainerDied","Data":"762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc"} Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.988915 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"21ce0079-9cf2-4a67-b731-fc1b02efaf3c","Type":"ContainerDied","Data":"9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c"} Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.988924 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"21ce0079-9cf2-4a67-b731-fc1b02efaf3c","Type":"ContainerDied","Data":"fcd08b7cf8da2474f2cc0894ff869655491991e2608910217e7a55d06ebc3c83"} Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.989006 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:10 crc kubenswrapper[5008]: I1126 22:59:10.998905 5008 scope.go:117] "RemoveContainer" containerID="363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.003348 5008 generic.go:334] "Generic (PLEG): container finished" podID="a1a5d6ee-2595-4413-95be-b4c4656e978d" containerID="e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2" exitCode=0 Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.003383 5008 generic.go:334] "Generic (PLEG): container finished" podID="a1a5d6ee-2595-4413-95be-b4c4656e978d" containerID="9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4" exitCode=0 Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.003393 5008 generic.go:334] "Generic (PLEG): container finished" podID="a1a5d6ee-2595-4413-95be-b4c4656e978d" containerID="12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf" exitCode=143 Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.003415 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"a1a5d6ee-2595-4413-95be-b4c4656e978d","Type":"ContainerDied","Data":"e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2"} Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.003455 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"a1a5d6ee-2595-4413-95be-b4c4656e978d","Type":"ContainerDied","Data":"9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4"} Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.003467 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"a1a5d6ee-2595-4413-95be-b4c4656e978d","Type":"ContainerDied","Data":"12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf"} Nov 
26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.003476 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"a1a5d6ee-2595-4413-95be-b4c4656e978d","Type":"ContainerDied","Data":"0e1e245a920e21a774b2c096f086588e7399263a3a04b56d9b3290157b6de938"} Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.003792 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.013668 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-run\") pod \"bd420810-75bc-480d-82c6-7d5df62216a4\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.013726 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"bd420810-75bc-480d-82c6-7d5df62216a4\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.013756 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-var-locks-brick\") pod \"bd420810-75bc-480d-82c6-7d5df62216a4\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.013781 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-lib-modules\") pod \"bd420810-75bc-480d-82c6-7d5df62216a4\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.013819 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd420810-75bc-480d-82c6-7d5df62216a4-httpd-run\") pod \"bd420810-75bc-480d-82c6-7d5df62216a4\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.013846 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.013884 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-sys\") pod \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.013913 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-logs\") pod \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.013947 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-httpd-run\") pod \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.013996 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"bd420810-75bc-480d-82c6-7d5df62216a4\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.014038 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-scripts\") pod \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.014074 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-var-locks-brick\") pod \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.014107 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-etc-nvme\") pod \"bd420810-75bc-480d-82c6-7d5df62216a4\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.014159 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-etc-iscsi\") pod \"bd420810-75bc-480d-82c6-7d5df62216a4\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.014201 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhw9s\" (UniqueName: \"kubernetes.io/projected/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-kube-api-access-hhw9s\") pod \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.014233 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-run\") pod \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " Nov 26 
22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.014260 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-etc-nvme\") pod \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.014287 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlztd\" (UniqueName: \"kubernetes.io/projected/bd420810-75bc-480d-82c6-7d5df62216a4-kube-api-access-mlztd\") pod \"bd420810-75bc-480d-82c6-7d5df62216a4\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.014313 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-config-data\") pod \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.014338 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd420810-75bc-480d-82c6-7d5df62216a4-logs\") pod \"bd420810-75bc-480d-82c6-7d5df62216a4\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.014367 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd420810-75bc-480d-82c6-7d5df62216a4-config-data\") pod \"bd420810-75bc-480d-82c6-7d5df62216a4\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.014398 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-etc-iscsi\") pod 
\"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.014424 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-sys\") pod \"bd420810-75bc-480d-82c6-7d5df62216a4\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.014446 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd420810-75bc-480d-82c6-7d5df62216a4-scripts\") pod \"bd420810-75bc-480d-82c6-7d5df62216a4\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.014476 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-dev\") pod \"bd420810-75bc-480d-82c6-7d5df62216a4\" (UID: \"bd420810-75bc-480d-82c6-7d5df62216a4\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.014500 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.014524 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-lib-modules\") pod \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.014559 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-dev\") pod 
\"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\" (UID: \"21ce0079-9cf2-4a67-b731-fc1b02efaf3c\") " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.015548 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "21ce0079-9cf2-4a67-b731-fc1b02efaf3c" (UID: "21ce0079-9cf2-4a67-b731-fc1b02efaf3c"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.015572 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-run" (OuterVolumeSpecName: "run") pod "bd420810-75bc-480d-82c6-7d5df62216a4" (UID: "bd420810-75bc-480d-82c6-7d5df62216a4"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.015589 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "bd420810-75bc-480d-82c6-7d5df62216a4" (UID: "bd420810-75bc-480d-82c6-7d5df62216a4"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.015651 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "21ce0079-9cf2-4a67-b731-fc1b02efaf3c" (UID: "21ce0079-9cf2-4a67-b731-fc1b02efaf3c"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.015930 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "21ce0079-9cf2-4a67-b731-fc1b02efaf3c" (UID: "21ce0079-9cf2-4a67-b731-fc1b02efaf3c"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.016010 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-run" (OuterVolumeSpecName: "run") pod "21ce0079-9cf2-4a67-b731-fc1b02efaf3c" (UID: "21ce0079-9cf2-4a67-b731-fc1b02efaf3c"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.016248 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-sys" (OuterVolumeSpecName: "sys") pod "bd420810-75bc-480d-82c6-7d5df62216a4" (UID: "bd420810-75bc-480d-82c6-7d5df62216a4"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.016316 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd420810-75bc-480d-82c6-7d5df62216a4-logs" (OuterVolumeSpecName: "logs") pod "bd420810-75bc-480d-82c6-7d5df62216a4" (UID: "bd420810-75bc-480d-82c6-7d5df62216a4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.016834 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "bd420810-75bc-480d-82c6-7d5df62216a4" (UID: "bd420810-75bc-480d-82c6-7d5df62216a4"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.016879 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "bd420810-75bc-480d-82c6-7d5df62216a4" (UID: "bd420810-75bc-480d-82c6-7d5df62216a4"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.017140 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-sys" (OuterVolumeSpecName: "sys") pod "21ce0079-9cf2-4a67-b731-fc1b02efaf3c" (UID: "21ce0079-9cf2-4a67-b731-fc1b02efaf3c"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.017152 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-logs" (OuterVolumeSpecName: "logs") pod "21ce0079-9cf2-4a67-b731-fc1b02efaf3c" (UID: "21ce0079-9cf2-4a67-b731-fc1b02efaf3c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.017168 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "bd420810-75bc-480d-82c6-7d5df62216a4" (UID: "bd420810-75bc-480d-82c6-7d5df62216a4"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.017184 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "21ce0079-9cf2-4a67-b731-fc1b02efaf3c" (UID: "21ce0079-9cf2-4a67-b731-fc1b02efaf3c"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.017535 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-dev" (OuterVolumeSpecName: "dev") pod "21ce0079-9cf2-4a67-b731-fc1b02efaf3c" (UID: "21ce0079-9cf2-4a67-b731-fc1b02efaf3c"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.017720 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-dev" (OuterVolumeSpecName: "dev") pod "bd420810-75bc-480d-82c6-7d5df62216a4" (UID: "bd420810-75bc-480d-82c6-7d5df62216a4"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.017950 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1a5d6ee-2595-4413-95be-b4c4656e978d-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.017982 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.017994 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-dev\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.018003 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1a5d6ee-2595-4413-95be-b4c4656e978d-logs\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019036 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019062 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019084 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019093 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019108 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019117 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-dev\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019127 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-sys\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019152 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019163 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019172 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-logs\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019184 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1a5d6ee-2595-4413-95be-b4c4656e978d-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019192 5008 reconciler_common.go:293] "Volume detached for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019201 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019210 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019222 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019231 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-sys\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019240 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019249 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a1a5d6ee-2595-4413-95be-b4c4656e978d-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019262 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqc9g\" (UniqueName: \"kubernetes.io/projected/a1a5d6ee-2595-4413-95be-b4c4656e978d-kube-api-access-fqc9g\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc 
kubenswrapper[5008]: I1126 22:59:11.019273 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019281 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019291 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd420810-75bc-480d-82c6-7d5df62216a4-logs\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019302 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019312 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019320 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-sys\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019332 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa146f2c-c806-443d-b49c-289beed985a6-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019342 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/bd420810-75bc-480d-82c6-7d5df62216a4-dev\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:11 
crc kubenswrapper[5008]: I1126 22:59:11.019350 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-lib-modules\") on node \"crc\" DevicePath \"\""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019447 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "21ce0079-9cf2-4a67-b731-fc1b02efaf3c" (UID: "21ce0079-9cf2-4a67-b731-fc1b02efaf3c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019465 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd420810-75bc-480d-82c6-7d5df62216a4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bd420810-75bc-480d-82c6-7d5df62216a4" (UID: "bd420810-75bc-480d-82c6-7d5df62216a4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.019580 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd420810-75bc-480d-82c6-7d5df62216a4-kube-api-access-mlztd" (OuterVolumeSpecName: "kube-api-access-mlztd") pod "bd420810-75bc-480d-82c6-7d5df62216a4" (UID: "bd420810-75bc-480d-82c6-7d5df62216a4"). InnerVolumeSpecName "kube-api-access-mlztd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.020910 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance-cache") pod "bd420810-75bc-480d-82c6-7d5df62216a4" (UID: "bd420810-75bc-480d-82c6-7d5df62216a4"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.020993 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-kube-api-access-hhw9s" (OuterVolumeSpecName: "kube-api-access-hhw9s") pod "21ce0079-9cf2-4a67-b731-fc1b02efaf3c" (UID: "21ce0079-9cf2-4a67-b731-fc1b02efaf3c"). InnerVolumeSpecName "kube-api-access-hhw9s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.021059 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd420810-75bc-480d-82c6-7d5df62216a4-scripts" (OuterVolumeSpecName: "scripts") pod "bd420810-75bc-480d-82c6-7d5df62216a4" (UID: "bd420810-75bc-480d-82c6-7d5df62216a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.023273 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-scripts" (OuterVolumeSpecName: "scripts") pod "21ce0079-9cf2-4a67-b731-fc1b02efaf3c" (UID: "21ce0079-9cf2-4a67-b731-fc1b02efaf3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.027834 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance") pod "21ce0079-9cf2-4a67-b731-fc1b02efaf3c" (UID: "21ce0079-9cf2-4a67-b731-fc1b02efaf3c"). InnerVolumeSpecName "local-storage15-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.030039 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance-cache") pod "21ce0079-9cf2-4a67-b731-fc1b02efaf3c" (UID: "21ce0079-9cf2-4a67-b731-fc1b02efaf3c"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.036727 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.045995 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.046573 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"]
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.053200 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance") pod "bd420810-75bc-480d-82c6-7d5df62216a4" (UID: "bd420810-75bc-480d-82c6-7d5df62216a4"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.055565 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"]
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.077714 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a5d6ee-2595-4413-95be-b4c4656e978d-config-data" (OuterVolumeSpecName: "config-data") pod "a1a5d6ee-2595-4413-95be-b4c4656e978d" (UID: "a1a5d6ee-2595-4413-95be-b4c4656e978d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.110935 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-config-data" (OuterVolumeSpecName: "config-data") pod "21ce0079-9cf2-4a67-b731-fc1b02efaf3c" (UID: "21ce0079-9cf2-4a67-b731-fc1b02efaf3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.119253 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd420810-75bc-480d-82c6-7d5df62216a4-config-data" (OuterVolumeSpecName: "config-data") pod "bd420810-75bc-480d-82c6-7d5df62216a4" (UID: "bd420810-75bc-480d-82c6-7d5df62216a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.120667 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhw9s\" (UniqueName: \"kubernetes.io/projected/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-kube-api-access-hhw9s\") on node \"crc\" DevicePath \"\""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.120753 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlztd\" (UniqueName: \"kubernetes.io/projected/bd420810-75bc-480d-82c6-7d5df62216a4-kube-api-access-mlztd\") on node \"crc\" DevicePath \"\""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.120807 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-config-data\") on node \"crc\" DevicePath \"\""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.120857 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd420810-75bc-480d-82c6-7d5df62216a4-config-data\") on node \"crc\" DevicePath \"\""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.120911 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd420810-75bc-480d-82c6-7d5df62216a4-scripts\") on node \"crc\" DevicePath \"\""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.121008 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" "
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.121092 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1a5d6ee-2595-4413-95be-b4c4656e978d-config-data\") on node \"crc\" DevicePath \"\""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.121143 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.121208 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.121260 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd420810-75bc-480d-82c6-7d5df62216a4-httpd-run\") on node \"crc\" DevicePath \"\""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.121323 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" "
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.121374 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.121423 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-httpd-run\") on node \"crc\" DevicePath \"\""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.121476 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" "
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.121525 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21ce0079-9cf2-4a67-b731-fc1b02efaf3c-scripts\") on node \"crc\" DevicePath \"\""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.134979 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.137267 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.139307 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.139672 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.163149 5008 scope.go:117] "RemoveContainer" containerID="8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.183919 5008 scope.go:117] "RemoveContainer" containerID="f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925"
Nov 26 22:59:11 crc kubenswrapper[5008]: E1126 22:59:11.184410 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925\": container with ID starting with f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925 not found: ID does not exist" containerID="f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.184537 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925"} err="failed to get container status \"f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925\": rpc error: code = NotFound desc = could not find container \"f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925\": container with ID starting with f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925 not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.184632 5008 scope.go:117] "RemoveContainer" containerID="363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4"
Nov 26 22:59:11 crc kubenswrapper[5008]: E1126 22:59:11.185081 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4\": container with ID starting with 363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4 not found: ID does not exist" containerID="363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.185127 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4"} err="failed to get container status \"363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4\": rpc error: code = NotFound desc = could not find container \"363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4\": container with ID starting with 363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4 not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.185159 5008 scope.go:117] "RemoveContainer" containerID="8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88"
Nov 26 22:59:11 crc kubenswrapper[5008]: E1126 22:59:11.185568 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88\": container with ID starting with 8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88 not found: ID does not exist" containerID="8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.185616 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88"} err="failed to get container status \"8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88\": rpc error: code = NotFound desc = could not find container \"8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88\": container with ID starting with 8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88 not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.185643 5008 scope.go:117] "RemoveContainer" containerID="f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.185906 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925"} err="failed to get container status \"f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925\": rpc error: code = NotFound desc = could not find container \"f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925\": container with ID starting with f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925 not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.186056 5008 scope.go:117] "RemoveContainer" containerID="363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.186401 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4"} err="failed to get container status \"363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4\": rpc error: code = NotFound desc = could not find container \"363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4\": container with ID starting with 363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4 not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.186423 5008 scope.go:117] "RemoveContainer" containerID="8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.186744 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88"} err="failed to get container status \"8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88\": rpc error: code = NotFound desc = could not find container \"8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88\": container with ID starting with 8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88 not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.186767 5008 scope.go:117] "RemoveContainer" containerID="f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.187001 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925"} err="failed to get container status \"f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925\": rpc error: code = NotFound desc = could not find container \"f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925\": container with ID starting with f47353f7f889229050ff7ae088d0eace2e00c84545ca04c4b2e1bb686b026925 not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.187022 5008 scope.go:117] "RemoveContainer" containerID="363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.187234 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4"} err="failed to get container status \"363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4\": rpc error: code = NotFound desc = could not find container \"363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4\": container with ID starting with 363bd0dd0d605946b0c730523645c163c5d0976afea4012560d123a6c83be9d4 not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.187258 5008 scope.go:117] "RemoveContainer" containerID="8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.187565 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88"} err="failed to get container status \"8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88\": rpc error: code = NotFound desc = could not find container \"8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88\": container with ID starting with 8d5476da85520bb493832b46f4cdd2f49a589d86e2015c196d1c5fa3e2f6cf88 not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.187583 5008 scope.go:117] "RemoveContainer" containerID="028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.207310 5008 scope.go:117] "RemoveContainer" containerID="e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.222480 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.222505 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.222513 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.222522 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\""
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.225414 5008 scope.go:117] "RemoveContainer" containerID="a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.243477 5008 scope.go:117] "RemoveContainer" containerID="028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95"
Nov 26 22:59:11 crc kubenswrapper[5008]: E1126 22:59:11.243886 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95\": container with ID starting with 028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95 not found: ID does not exist" containerID="028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.243923 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95"} err="failed to get container status \"028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95\": rpc error: code = NotFound desc = could not find container \"028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95\": container with ID starting with 028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95 not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.243991 5008 scope.go:117] "RemoveContainer" containerID="e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616"
Nov 26 22:59:11 crc kubenswrapper[5008]: E1126 22:59:11.244373 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616\": container with ID starting with e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616 not found: ID does not exist" containerID="e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.244491 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616"} err="failed to get container status \"e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616\": rpc error: code = NotFound desc = could not find container \"e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616\": container with ID starting with e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616 not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.244578 5008 scope.go:117] "RemoveContainer" containerID="a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58"
Nov 26 22:59:11 crc kubenswrapper[5008]: E1126 22:59:11.244982 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58\": container with ID starting with a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58 not found: ID does not exist" containerID="a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.245001 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58"} err="failed to get container status \"a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58\": rpc error: code = NotFound desc = could not find container \"a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58\": container with ID starting with a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58 not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.245013 5008 scope.go:117] "RemoveContainer" containerID="028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.245305 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95"} err="failed to get container status \"028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95\": rpc error: code = NotFound desc = could not find container \"028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95\": container with ID starting with 028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95 not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.245321 5008 scope.go:117] "RemoveContainer" containerID="e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.245575 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616"} err="failed to get container status \"e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616\": rpc error: code = NotFound desc = could not find container \"e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616\": container with ID starting with e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616 not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.245696 5008 scope.go:117] "RemoveContainer" containerID="a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.246177 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58"} err="failed to get container status \"a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58\": rpc error: code = NotFound desc = could not find container \"a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58\": container with ID starting with a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58 not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.246210 5008 scope.go:117] "RemoveContainer" containerID="028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.246563 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95"} err="failed to get container status \"028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95\": rpc error: code = NotFound desc = could not find container \"028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95\": container with ID starting with 028ab6c97321370ec35938d541a26ad450baa5d398233a2fad7c22f3dbb69e95 not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.246580 5008 scope.go:117] "RemoveContainer" containerID="e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.246812 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616"} err="failed to get container status \"e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616\": rpc error: code = NotFound desc = could not find container \"e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616\": container with ID starting with e1e98fc277d510d7b47ea177ef85fd3a03f4c1740be27d9320f90e4f77217616 not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.246907 5008 scope.go:117] "RemoveContainer" containerID="a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.247254 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58"} err="failed to get container status \"a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58\": rpc error: code = NotFound desc = could not find container \"a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58\": container with ID starting with a790c3405fc6fb9d5ed82905cce3a6cebfb07308274deb5085c745f3a7271a58 not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.247274 5008 scope.go:117] "RemoveContainer" containerID="d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.278763 5008 scope.go:117] "RemoveContainer" containerID="762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.312775 5008 scope.go:117] "RemoveContainer" containerID="9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.317531 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"]
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.324172 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"]
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.342839 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"]
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.351294 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"]
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.357249 5008 scope.go:117] "RemoveContainer" containerID="d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb"
Nov 26 22:59:11 crc kubenswrapper[5008]: E1126 22:59:11.357714 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb\": container with ID starting with d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb not found: ID does not exist" containerID="d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.357865 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb"} err="failed to get container status \"d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb\": rpc error: code = NotFound desc = could not find container \"d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb\": container with ID starting with d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.358008 5008 scope.go:117] "RemoveContainer" containerID="762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.358181 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"]
Nov 26 22:59:11 crc kubenswrapper[5008]: E1126 22:59:11.358456 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc\": container with ID starting with 762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc not found: ID does not exist" containerID="762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.358557 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc"} err="failed to get container status \"762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc\": rpc error: code = NotFound desc = could not find container \"762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc\": container with ID starting with 762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.358672 5008 scope.go:117] "RemoveContainer" containerID="9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c"
Nov 26 22:59:11 crc kubenswrapper[5008]: E1126 22:59:11.359040 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c\": container with ID starting with 9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c not found: ID does not exist" containerID="9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.359072 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c"} err="failed to get container status \"9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c\": rpc error: code = NotFound desc = could not find container \"9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c\": container with ID starting with 9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.359096 5008 scope.go:117] "RemoveContainer" containerID="d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.359453 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb"} err="failed to get container status \"d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb\": rpc error: code = NotFound desc = could not find container \"d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb\": container with ID starting with d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.359535 5008 scope.go:117] "RemoveContainer" containerID="762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.359843 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc"} err="failed to get container status \"762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc\": rpc error: code = NotFound desc = could not find container \"762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc\": container with ID starting with 762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.359872 5008 scope.go:117] "RemoveContainer" containerID="9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.360086 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c"} err="failed to get container status \"9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c\": rpc error: code = NotFound desc = could not find container \"9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c\": container with ID starting with 9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.360106 5008 scope.go:117] "RemoveContainer" containerID="d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.360262 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb"} err="failed to get container status \"d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb\": rpc error: code = NotFound desc = could not find container \"d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb\": container with ID starting with d279aa43774cc7af5a3d6527476d6e923b67c06f50027c141e9ea878fffa02eb not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.360282 5008 scope.go:117] "RemoveContainer" containerID="762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.360462 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc"} err="failed to get container status \"762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc\": rpc error: code = NotFound desc = could not find container \"762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc\": container with ID starting with 762eea5398839097fea1e31685dc781eb7578dead0ed16006f4ae21566989cdc not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.360481 5008 scope.go:117] "RemoveContainer" containerID="9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.360630 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c"} err="failed to get container status \"9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c\": rpc error: code = NotFound desc = could not find container \"9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c\": container with ID starting with 9f64a05bcc16832bfb4be493662b0155dd157ef6434e9f3364a669bc1ea9201c not found: ID does not exist"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.360649 5008 scope.go:117] "RemoveContainer" containerID="e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.363925 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"]
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.397830 5008 scope.go:117] "RemoveContainer" containerID="9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4"
Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.421421 5008 scope.go:117] "RemoveContainer"
containerID="12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.440485 5008 scope.go:117] "RemoveContainer" containerID="e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2" Nov 26 22:59:11 crc kubenswrapper[5008]: E1126 22:59:11.441041 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2\": container with ID starting with e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2 not found: ID does not exist" containerID="e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.441077 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2"} err="failed to get container status \"e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2\": rpc error: code = NotFound desc = could not find container \"e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2\": container with ID starting with e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2 not found: ID does not exist" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.441098 5008 scope.go:117] "RemoveContainer" containerID="9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4" Nov 26 22:59:11 crc kubenswrapper[5008]: E1126 22:59:11.441431 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4\": container with ID starting with 9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4 not found: ID does not exist" containerID="9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4" Nov 26 22:59:11 crc 
kubenswrapper[5008]: I1126 22:59:11.441453 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4"} err="failed to get container status \"9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4\": rpc error: code = NotFound desc = could not find container \"9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4\": container with ID starting with 9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4 not found: ID does not exist" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.441466 5008 scope.go:117] "RemoveContainer" containerID="12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf" Nov 26 22:59:11 crc kubenswrapper[5008]: E1126 22:59:11.442043 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf\": container with ID starting with 12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf not found: ID does not exist" containerID="12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.442064 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf"} err="failed to get container status \"12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf\": rpc error: code = NotFound desc = could not find container \"12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf\": container with ID starting with 12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf not found: ID does not exist" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.442077 5008 scope.go:117] "RemoveContainer" containerID="e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2" Nov 26 
22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.442439 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2"} err="failed to get container status \"e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2\": rpc error: code = NotFound desc = could not find container \"e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2\": container with ID starting with e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2 not found: ID does not exist" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.442462 5008 scope.go:117] "RemoveContainer" containerID="9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.442954 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4"} err="failed to get container status \"9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4\": rpc error: code = NotFound desc = could not find container \"9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4\": container with ID starting with 9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4 not found: ID does not exist" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.443006 5008 scope.go:117] "RemoveContainer" containerID="12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.443253 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf"} err="failed to get container status \"12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf\": rpc error: code = NotFound desc = could not find container 
\"12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf\": container with ID starting with 12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf not found: ID does not exist" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.443273 5008 scope.go:117] "RemoveContainer" containerID="e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.443513 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2"} err="failed to get container status \"e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2\": rpc error: code = NotFound desc = could not find container \"e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2\": container with ID starting with e79a317423ecc516e3e0082f14403a4ea5665caf43d09c2fcdfafacb570ab2a2 not found: ID does not exist" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.443546 5008 scope.go:117] "RemoveContainer" containerID="9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.443776 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4"} err="failed to get container status \"9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4\": rpc error: code = NotFound desc = could not find container \"9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4\": container with ID starting with 9fc369b132aace7e2a839d152ecc2e9b1c68ca5327becce3477bc498d8860cc4 not found: ID does not exist" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.443801 5008 scope.go:117] "RemoveContainer" containerID="12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.444403 5008 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf"} err="failed to get container status \"12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf\": rpc error: code = NotFound desc = could not find container \"12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf\": container with ID starting with 12e0a095c039a3d17aa0f0c47b6b71b7eca88097f5de39666eba7467b9d398cf not found: ID does not exist" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.534114 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21ce0079-9cf2-4a67-b731-fc1b02efaf3c" path="/var/lib/kubelet/pods/21ce0079-9cf2-4a67-b731-fc1b02efaf3c/volumes" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.535301 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1a5d6ee-2595-4413-95be-b4c4656e978d" path="/var/lib/kubelet/pods/a1a5d6ee-2595-4413-95be-b4c4656e978d/volumes" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.536136 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd420810-75bc-480d-82c6-7d5df62216a4" path="/var/lib/kubelet/pods/bd420810-75bc-480d-82c6-7d5df62216a4/volumes" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.537381 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa146f2c-c806-443d-b49c-289beed985a6" path="/var/lib/kubelet/pods/fa146f2c-c806-443d-b49c-289beed985a6/volumes" Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.894140 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.894459 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="3236fda8-50c3-4d65-9c07-3f9817cc3a9f" containerName="glance-log" 
containerID="cri-o://d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577" gracePeriod=30 Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.894537 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="3236fda8-50c3-4d65-9c07-3f9817cc3a9f" containerName="glance-api" containerID="cri-o://443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d" gracePeriod=30 Nov 26 22:59:11 crc kubenswrapper[5008]: I1126 22:59:11.894537 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="3236fda8-50c3-4d65-9c07-3f9817cc3a9f" containerName="glance-httpd" containerID="cri-o://d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39" gracePeriod=30 Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.200854 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.201413 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="e94f5176-d76e-496c-b44e-7b167a690016" containerName="glance-httpd" containerID="cri-o://8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138" gracePeriod=30 Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.201436 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="e94f5176-d76e-496c-b44e-7b167a690016" containerName="glance-api" containerID="cri-o://6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258" gracePeriod=30 Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.201333 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="e94f5176-d76e-496c-b44e-7b167a690016" 
containerName="glance-log" containerID="cri-o://7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721" gracePeriod=30 Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.679486 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.848926 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-logs\") pod \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.848986 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-etc-iscsi\") pod \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849006 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849032 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-config-data\") pod \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849060 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-run\") pod \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") 
" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849075 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-sys\") pod \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849108 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlgjv\" (UniqueName: \"kubernetes.io/projected/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-kube-api-access-xlgjv\") pod \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849126 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-dev\") pod \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849123 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "3236fda8-50c3-4d65-9c07-3f9817cc3a9f" (UID: "3236fda8-50c3-4d65-9c07-3f9817cc3a9f"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849152 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-var-locks-brick\") pod \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849167 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-run" (OuterVolumeSpecName: "run") pod "3236fda8-50c3-4d65-9c07-3f9817cc3a9f" (UID: "3236fda8-50c3-4d65-9c07-3f9817cc3a9f"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849197 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849205 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-dev" (OuterVolumeSpecName: "dev") pod "3236fda8-50c3-4d65-9c07-3f9817cc3a9f" (UID: "3236fda8-50c3-4d65-9c07-3f9817cc3a9f"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849218 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-scripts\") pod \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849246 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-sys" (OuterVolumeSpecName: "sys") pod "3236fda8-50c3-4d65-9c07-3f9817cc3a9f" (UID: "3236fda8-50c3-4d65-9c07-3f9817cc3a9f"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849278 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "3236fda8-50c3-4d65-9c07-3f9817cc3a9f" (UID: "3236fda8-50c3-4d65-9c07-3f9817cc3a9f"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849276 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-lib-modules\") pod \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849325 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "3236fda8-50c3-4d65-9c07-3f9817cc3a9f" (UID: "3236fda8-50c3-4d65-9c07-3f9817cc3a9f"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849368 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-etc-nvme\") pod \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849438 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-httpd-run\") pod \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\" (UID: \"3236fda8-50c3-4d65-9c07-3f9817cc3a9f\") " Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849518 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "3236fda8-50c3-4d65-9c07-3f9817cc3a9f" (UID: "3236fda8-50c3-4d65-9c07-3f9817cc3a9f"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849985 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849998 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.850008 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.850017 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.850025 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-sys\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.850033 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-dev\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.850040 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.849985 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-logs" (OuterVolumeSpecName: "logs") pod "3236fda8-50c3-4d65-9c07-3f9817cc3a9f" (UID: "3236fda8-50c3-4d65-9c07-3f9817cc3a9f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.850074 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3236fda8-50c3-4d65-9c07-3f9817cc3a9f" (UID: "3236fda8-50c3-4d65-9c07-3f9817cc3a9f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.857996 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "3236fda8-50c3-4d65-9c07-3f9817cc3a9f" (UID: "3236fda8-50c3-4d65-9c07-3f9817cc3a9f"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.873174 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-scripts" (OuterVolumeSpecName: "scripts") pod "3236fda8-50c3-4d65-9c07-3f9817cc3a9f" (UID: "3236fda8-50c3-4d65-9c07-3f9817cc3a9f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.873305 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-kube-api-access-xlgjv" (OuterVolumeSpecName: "kube-api-access-xlgjv") pod "3236fda8-50c3-4d65-9c07-3f9817cc3a9f" (UID: "3236fda8-50c3-4d65-9c07-3f9817cc3a9f"). InnerVolumeSpecName "kube-api-access-xlgjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.873359 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance-cache") pod "3236fda8-50c3-4d65-9c07-3f9817cc3a9f" (UID: "3236fda8-50c3-4d65-9c07-3f9817cc3a9f"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.947502 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-config-data" (OuterVolumeSpecName: "config-data") pod "3236fda8-50c3-4d65-9c07-3f9817cc3a9f" (UID: "3236fda8-50c3-4d65-9c07-3f9817cc3a9f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.951359 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.951388 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.951399 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.951408 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-logs\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.951428 5008 
reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.951437 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.951446 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlgjv\" (UniqueName: \"kubernetes.io/projected/3236fda8-50c3-4d65-9c07-3f9817cc3a9f-kube-api-access-xlgjv\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.967660 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.968031 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 26 22:59:12 crc kubenswrapper[5008]: I1126 22:59:12.971382 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.030648 5008 generic.go:334] "Generic (PLEG): container finished" podID="e94f5176-d76e-496c-b44e-7b167a690016" containerID="6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258" exitCode=0 Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.030849 5008 generic.go:334] "Generic (PLEG): container finished" podID="e94f5176-d76e-496c-b44e-7b167a690016" containerID="8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138" exitCode=0 Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.030906 5008 generic.go:334] "Generic (PLEG): container finished" podID="e94f5176-d76e-496c-b44e-7b167a690016" containerID="7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721" exitCode=143 Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.031005 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e94f5176-d76e-496c-b44e-7b167a690016","Type":"ContainerDied","Data":"6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258"} Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.031106 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e94f5176-d76e-496c-b44e-7b167a690016","Type":"ContainerDied","Data":"8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138"} Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.031165 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e94f5176-d76e-496c-b44e-7b167a690016","Type":"ContainerDied","Data":"7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721"} Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.031219 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"e94f5176-d76e-496c-b44e-7b167a690016","Type":"ContainerDied","Data":"af6e260439f078604aef07fb4e58f26fd6be15fed2a2d02ce729b51f700c89b0"} Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.031278 5008 scope.go:117] "RemoveContainer" containerID="6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.031444 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.035300 5008 generic.go:334] "Generic (PLEG): container finished" podID="3236fda8-50c3-4d65-9c07-3f9817cc3a9f" containerID="443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d" exitCode=0 Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.035332 5008 generic.go:334] "Generic (PLEG): container finished" podID="3236fda8-50c3-4d65-9c07-3f9817cc3a9f" containerID="d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39" exitCode=0 Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.035340 5008 generic.go:334] "Generic (PLEG): container finished" podID="3236fda8-50c3-4d65-9c07-3f9817cc3a9f" containerID="d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577" exitCode=143 Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.035437 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.035437 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"3236fda8-50c3-4d65-9c07-3f9817cc3a9f","Type":"ContainerDied","Data":"443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d"} Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.035648 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"3236fda8-50c3-4d65-9c07-3f9817cc3a9f","Type":"ContainerDied","Data":"d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39"} Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.035714 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"3236fda8-50c3-4d65-9c07-3f9817cc3a9f","Type":"ContainerDied","Data":"d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577"} Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.035775 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"3236fda8-50c3-4d65-9c07-3f9817cc3a9f","Type":"ContainerDied","Data":"bc8d5514c192c9c0e364941828da412947efe2cb86ccc767040f729ab241a100"} Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.049000 5008 scope.go:117] "RemoveContainer" containerID="8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.051941 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-dev\") pod \"e94f5176-d76e-496c-b44e-7b167a690016\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.051998 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"e94f5176-d76e-496c-b44e-7b167a690016\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052035 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e94f5176-d76e-496c-b44e-7b167a690016-scripts\") pod \"e94f5176-d76e-496c-b44e-7b167a690016\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052052 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-dev" (OuterVolumeSpecName: "dev") pod "e94f5176-d76e-496c-b44e-7b167a690016" (UID: "e94f5176-d76e-496c-b44e-7b167a690016"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052067 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-etc-nvme\") pod \"e94f5176-d76e-496c-b44e-7b167a690016\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052121 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-sys\") pod \"e94f5176-d76e-496c-b44e-7b167a690016\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052181 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-var-locks-brick\") pod \"e94f5176-d76e-496c-b44e-7b167a690016\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " 
Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052217 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-run\") pod \"e94f5176-d76e-496c-b44e-7b167a690016\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052244 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-lib-modules\") pod \"e94f5176-d76e-496c-b44e-7b167a690016\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052266 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e94f5176-d76e-496c-b44e-7b167a690016-logs\") pod \"e94f5176-d76e-496c-b44e-7b167a690016\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052283 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-etc-iscsi\") pod \"e94f5176-d76e-496c-b44e-7b167a690016\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052312 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94f5176-d76e-496c-b44e-7b167a690016-config-data\") pod \"e94f5176-d76e-496c-b44e-7b167a690016\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052338 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g98w\" (UniqueName: \"kubernetes.io/projected/e94f5176-d76e-496c-b44e-7b167a690016-kube-api-access-8g98w\") pod 
\"e94f5176-d76e-496c-b44e-7b167a690016\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052362 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e94f5176-d76e-496c-b44e-7b167a690016-httpd-run\") pod \"e94f5176-d76e-496c-b44e-7b167a690016\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052404 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"e94f5176-d76e-496c-b44e-7b167a690016\" (UID: \"e94f5176-d76e-496c-b44e-7b167a690016\") " Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052674 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052692 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052702 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-dev\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052240 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-sys" (OuterVolumeSpecName: "sys") pod "e94f5176-d76e-496c-b44e-7b167a690016" (UID: "e94f5176-d76e-496c-b44e-7b167a690016"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052484 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "e94f5176-d76e-496c-b44e-7b167a690016" (UID: "e94f5176-d76e-496c-b44e-7b167a690016"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052917 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e94f5176-d76e-496c-b44e-7b167a690016-logs" (OuterVolumeSpecName: "logs") pod "e94f5176-d76e-496c-b44e-7b167a690016" (UID: "e94f5176-d76e-496c-b44e-7b167a690016"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.053343 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "e94f5176-d76e-496c-b44e-7b167a690016" (UID: "e94f5176-d76e-496c-b44e-7b167a690016"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052938 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "e94f5176-d76e-496c-b44e-7b167a690016" (UID: "e94f5176-d76e-496c-b44e-7b167a690016"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.052954 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-run" (OuterVolumeSpecName: "run") pod "e94f5176-d76e-496c-b44e-7b167a690016" (UID: "e94f5176-d76e-496c-b44e-7b167a690016"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.053000 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "e94f5176-d76e-496c-b44e-7b167a690016" (UID: "e94f5176-d76e-496c-b44e-7b167a690016"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.053513 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e94f5176-d76e-496c-b44e-7b167a690016-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e94f5176-d76e-496c-b44e-7b167a690016" (UID: "e94f5176-d76e-496c-b44e-7b167a690016"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.054479 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94f5176-d76e-496c-b44e-7b167a690016-scripts" (OuterVolumeSpecName: "scripts") pod "e94f5176-d76e-496c-b44e-7b167a690016" (UID: "e94f5176-d76e-496c-b44e-7b167a690016"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.055039 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "e94f5176-d76e-496c-b44e-7b167a690016" (UID: "e94f5176-d76e-496c-b44e-7b167a690016"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.063305 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "e94f5176-d76e-496c-b44e-7b167a690016" (UID: "e94f5176-d76e-496c-b44e-7b167a690016"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.069201 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e94f5176-d76e-496c-b44e-7b167a690016-kube-api-access-8g98w" (OuterVolumeSpecName: "kube-api-access-8g98w") pod "e94f5176-d76e-496c-b44e-7b167a690016" (UID: "e94f5176-d76e-496c-b44e-7b167a690016"). InnerVolumeSpecName "kube-api-access-8g98w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.074736 5008 scope.go:117] "RemoveContainer" containerID="7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.076256 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.081853 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.102209 5008 scope.go:117] "RemoveContainer" containerID="6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258" Nov 26 22:59:13 crc kubenswrapper[5008]: E1126 22:59:13.102699 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258\": container with ID starting with 6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258 not found: ID does not exist" containerID="6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.102739 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258"} err="failed to get container status \"6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258\": rpc error: code = NotFound desc = could not find container \"6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258\": container with ID starting with 6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258 not found: ID does not exist" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.102766 5008 scope.go:117] "RemoveContainer" containerID="8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138" 
Nov 26 22:59:13 crc kubenswrapper[5008]: E1126 22:59:13.103278 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138\": container with ID starting with 8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138 not found: ID does not exist" containerID="8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.103315 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138"} err="failed to get container status \"8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138\": rpc error: code = NotFound desc = could not find container \"8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138\": container with ID starting with 8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138 not found: ID does not exist" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.103336 5008 scope.go:117] "RemoveContainer" containerID="7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721" Nov 26 22:59:13 crc kubenswrapper[5008]: E1126 22:59:13.103578 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721\": container with ID starting with 7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721 not found: ID does not exist" containerID="7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.103610 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721"} err="failed to get container status 
\"7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721\": rpc error: code = NotFound desc = could not find container \"7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721\": container with ID starting with 7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721 not found: ID does not exist" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.103631 5008 scope.go:117] "RemoveContainer" containerID="6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.103855 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258"} err="failed to get container status \"6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258\": rpc error: code = NotFound desc = could not find container \"6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258\": container with ID starting with 6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258 not found: ID does not exist" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.103888 5008 scope.go:117] "RemoveContainer" containerID="8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.104097 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138"} err="failed to get container status \"8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138\": rpc error: code = NotFound desc = could not find container \"8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138\": container with ID starting with 8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138 not found: ID does not exist" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.104118 5008 scope.go:117] "RemoveContainer" 
containerID="7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.104373 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721"} err="failed to get container status \"7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721\": rpc error: code = NotFound desc = could not find container \"7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721\": container with ID starting with 7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721 not found: ID does not exist" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.104391 5008 scope.go:117] "RemoveContainer" containerID="6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.104573 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258"} err="failed to get container status \"6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258\": rpc error: code = NotFound desc = could not find container \"6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258\": container with ID starting with 6ddcb6543874f2d56796d35e4bdc3c43da0ee5bae0d34ce72f0ff808da192258 not found: ID does not exist" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.104594 5008 scope.go:117] "RemoveContainer" containerID="8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.104796 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138"} err="failed to get container status \"8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138\": rpc error: code = NotFound desc = could 
not find container \"8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138\": container with ID starting with 8288a37afead6aeae8bf69caefafe676f04dcd91a5d85e86cc7f189cfd93c138 not found: ID does not exist" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.104823 5008 scope.go:117] "RemoveContainer" containerID="7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.105018 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721"} err="failed to get container status \"7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721\": rpc error: code = NotFound desc = could not find container \"7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721\": container with ID starting with 7e2ea57d07a4dbe77cf418a1fe997804cded7a80fba443d1ba59f262a30c1721 not found: ID does not exist" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.105034 5008 scope.go:117] "RemoveContainer" containerID="443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.118258 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94f5176-d76e-496c-b44e-7b167a690016-config-data" (OuterVolumeSpecName: "config-data") pod "e94f5176-d76e-496c-b44e-7b167a690016" (UID: "e94f5176-d76e-496c-b44e-7b167a690016"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.153872 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.153904 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.153914 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e94f5176-d76e-496c-b44e-7b167a690016-logs\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.153922 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.153930 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94f5176-d76e-496c-b44e-7b167a690016-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.153941 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g98w\" (UniqueName: \"kubernetes.io/projected/e94f5176-d76e-496c-b44e-7b167a690016-kube-api-access-8g98w\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.153949 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e94f5176-d76e-496c-b44e-7b167a690016-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.154011 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice 
started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.154030 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.154040 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e94f5176-d76e-496c-b44e-7b167a690016-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.154048 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.165165 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-sys\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.165179 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e94f5176-d76e-496c-b44e-7b167a690016-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.166473 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.170535 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.182335 5008 scope.go:117] "RemoveContainer" 
containerID="d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.203439 5008 scope.go:117] "RemoveContainer" containerID="d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.225660 5008 scope.go:117] "RemoveContainer" containerID="443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d" Nov 26 22:59:13 crc kubenswrapper[5008]: E1126 22:59:13.226010 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d\": container with ID starting with 443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d not found: ID does not exist" containerID="443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.226075 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d"} err="failed to get container status \"443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d\": rpc error: code = NotFound desc = could not find container \"443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d\": container with ID starting with 443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d not found: ID does not exist" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.226098 5008 scope.go:117] "RemoveContainer" containerID="d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39" Nov 26 22:59:13 crc kubenswrapper[5008]: E1126 22:59:13.226440 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39\": container with ID starting with 
d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39 not found: ID does not exist" containerID="d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.226485 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39"} err="failed to get container status \"d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39\": rpc error: code = NotFound desc = could not find container \"d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39\": container with ID starting with d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39 not found: ID does not exist" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.226499 5008 scope.go:117] "RemoveContainer" containerID="d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577" Nov 26 22:59:13 crc kubenswrapper[5008]: E1126 22:59:13.226754 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577\": container with ID starting with d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577 not found: ID does not exist" containerID="d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.226776 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577"} err="failed to get container status \"d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577\": rpc error: code = NotFound desc = could not find container \"d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577\": container with ID starting with d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577 not found: ID does not 
exist" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.226816 5008 scope.go:117] "RemoveContainer" containerID="443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.227010 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d"} err="failed to get container status \"443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d\": rpc error: code = NotFound desc = could not find container \"443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d\": container with ID starting with 443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d not found: ID does not exist" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.227087 5008 scope.go:117] "RemoveContainer" containerID="d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.227606 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39"} err="failed to get container status \"d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39\": rpc error: code = NotFound desc = could not find container \"d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39\": container with ID starting with d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39 not found: ID does not exist" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.227635 5008 scope.go:117] "RemoveContainer" containerID="d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.227850 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577"} err="failed to get container status 
\"d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577\": rpc error: code = NotFound desc = could not find container \"d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577\": container with ID starting with d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577 not found: ID does not exist" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.227869 5008 scope.go:117] "RemoveContainer" containerID="443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.228069 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d"} err="failed to get container status \"443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d\": rpc error: code = NotFound desc = could not find container \"443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d\": container with ID starting with 443c0cc71a1501cef0f074ce1289477d188dde96fca7807219705e986d86df6d not found: ID does not exist" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.228088 5008 scope.go:117] "RemoveContainer" containerID="d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.228255 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39"} err="failed to get container status \"d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39\": rpc error: code = NotFound desc = could not find container \"d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39\": container with ID starting with d63111b6f413234f7f85d8c422bd41fea119ede36e182a8a216aeba094ebbd39 not found: ID does not exist" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.228273 5008 scope.go:117] "RemoveContainer" 
containerID="d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.228455 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577"} err="failed to get container status \"d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577\": rpc error: code = NotFound desc = could not find container \"d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577\": container with ID starting with d485635f4c72460466b09440047bb99231ccd35d434f3f5954c469cbdc5aa577 not found: ID does not exist" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.267247 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.267276 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.366162 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.373901 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.530522 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3236fda8-50c3-4d65-9c07-3f9817cc3a9f" path="/var/lib/kubelet/pods/3236fda8-50c3-4d65-9c07-3f9817cc3a9f/volumes" Nov 26 22:59:13 crc kubenswrapper[5008]: I1126 22:59:13.531555 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e94f5176-d76e-496c-b44e-7b167a690016" 
path="/var/lib/kubelet/pods/e94f5176-d76e-496c-b44e-7b167a690016/volumes" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.403566 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-ntc6s"] Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.408208 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-ntc6s"] Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.441011 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance0d4d-account-delete-xwvl7"] Nov 26 22:59:14 crc kubenswrapper[5008]: E1126 22:59:14.441365 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94f5176-d76e-496c-b44e-7b167a690016" containerName="glance-httpd" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.441393 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94f5176-d76e-496c-b44e-7b167a690016" containerName="glance-httpd" Nov 26 22:59:14 crc kubenswrapper[5008]: E1126 22:59:14.441430 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3236fda8-50c3-4d65-9c07-3f9817cc3a9f" containerName="glance-api" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.441441 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="3236fda8-50c3-4d65-9c07-3f9817cc3a9f" containerName="glance-api" Nov 26 22:59:14 crc kubenswrapper[5008]: E1126 22:59:14.441464 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd420810-75bc-480d-82c6-7d5df62216a4" containerName="glance-api" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.441474 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd420810-75bc-480d-82c6-7d5df62216a4" containerName="glance-api" Nov 26 22:59:14 crc kubenswrapper[5008]: E1126 22:59:14.441487 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3236fda8-50c3-4d65-9c07-3f9817cc3a9f" containerName="glance-httpd" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 
22:59:14.441497 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="3236fda8-50c3-4d65-9c07-3f9817cc3a9f" containerName="glance-httpd" Nov 26 22:59:14 crc kubenswrapper[5008]: E1126 22:59:14.441521 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa146f2c-c806-443d-b49c-289beed985a6" containerName="glance-httpd" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.441531 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa146f2c-c806-443d-b49c-289beed985a6" containerName="glance-httpd" Nov 26 22:59:14 crc kubenswrapper[5008]: E1126 22:59:14.441544 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd420810-75bc-480d-82c6-7d5df62216a4" containerName="glance-httpd" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.441554 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd420810-75bc-480d-82c6-7d5df62216a4" containerName="glance-httpd" Nov 26 22:59:14 crc kubenswrapper[5008]: E1126 22:59:14.441566 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94f5176-d76e-496c-b44e-7b167a690016" containerName="glance-api" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.441575 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94f5176-d76e-496c-b44e-7b167a690016" containerName="glance-api" Nov 26 22:59:14 crc kubenswrapper[5008]: E1126 22:59:14.441596 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3236fda8-50c3-4d65-9c07-3f9817cc3a9f" containerName="glance-log" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.441605 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="3236fda8-50c3-4d65-9c07-3f9817cc3a9f" containerName="glance-log" Nov 26 22:59:14 crc kubenswrapper[5008]: E1126 22:59:14.441619 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a5d6ee-2595-4413-95be-b4c4656e978d" containerName="glance-log" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.441627 5008 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a1a5d6ee-2595-4413-95be-b4c4656e978d" containerName="glance-log" Nov 26 22:59:14 crc kubenswrapper[5008]: E1126 22:59:14.441647 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ce0079-9cf2-4a67-b731-fc1b02efaf3c" containerName="glance-log" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.441658 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ce0079-9cf2-4a67-b731-fc1b02efaf3c" containerName="glance-log" Nov 26 22:59:14 crc kubenswrapper[5008]: E1126 22:59:14.441672 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa146f2c-c806-443d-b49c-289beed985a6" containerName="glance-api" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.441682 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa146f2c-c806-443d-b49c-289beed985a6" containerName="glance-api" Nov 26 22:59:14 crc kubenswrapper[5008]: E1126 22:59:14.441702 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd420810-75bc-480d-82c6-7d5df62216a4" containerName="glance-log" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.441712 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd420810-75bc-480d-82c6-7d5df62216a4" containerName="glance-log" Nov 26 22:59:14 crc kubenswrapper[5008]: E1126 22:59:14.441730 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94f5176-d76e-496c-b44e-7b167a690016" containerName="glance-log" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.441740 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94f5176-d76e-496c-b44e-7b167a690016" containerName="glance-log" Nov 26 22:59:14 crc kubenswrapper[5008]: E1126 22:59:14.441756 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a5d6ee-2595-4413-95be-b4c4656e978d" containerName="glance-httpd" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.441766 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a5d6ee-2595-4413-95be-b4c4656e978d" 
containerName="glance-httpd" Nov 26 22:59:14 crc kubenswrapper[5008]: E1126 22:59:14.441778 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa146f2c-c806-443d-b49c-289beed985a6" containerName="glance-log" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.441787 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa146f2c-c806-443d-b49c-289beed985a6" containerName="glance-log" Nov 26 22:59:14 crc kubenswrapper[5008]: E1126 22:59:14.441803 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ce0079-9cf2-4a67-b731-fc1b02efaf3c" containerName="glance-api" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.441813 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ce0079-9cf2-4a67-b731-fc1b02efaf3c" containerName="glance-api" Nov 26 22:59:14 crc kubenswrapper[5008]: E1126 22:59:14.441833 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ce0079-9cf2-4a67-b731-fc1b02efaf3c" containerName="glance-httpd" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.441843 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ce0079-9cf2-4a67-b731-fc1b02efaf3c" containerName="glance-httpd" Nov 26 22:59:14 crc kubenswrapper[5008]: E1126 22:59:14.441864 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a5d6ee-2595-4413-95be-b4c4656e978d" containerName="glance-api" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.441874 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a5d6ee-2595-4413-95be-b4c4656e978d" containerName="glance-api" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.442106 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="21ce0079-9cf2-4a67-b731-fc1b02efaf3c" containerName="glance-api" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.442128 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="e94f5176-d76e-496c-b44e-7b167a690016" containerName="glance-httpd" Nov 26 22:59:14 crc 
kubenswrapper[5008]: I1126 22:59:14.442144 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="3236fda8-50c3-4d65-9c07-3f9817cc3a9f" containerName="glance-httpd" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.442163 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="e94f5176-d76e-496c-b44e-7b167a690016" containerName="glance-log" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.442181 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa146f2c-c806-443d-b49c-289beed985a6" containerName="glance-httpd" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.442197 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa146f2c-c806-443d-b49c-289beed985a6" containerName="glance-log" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.442215 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a5d6ee-2595-4413-95be-b4c4656e978d" containerName="glance-httpd" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.442229 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="3236fda8-50c3-4d65-9c07-3f9817cc3a9f" containerName="glance-log" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.442244 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa146f2c-c806-443d-b49c-289beed985a6" containerName="glance-api" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.442260 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd420810-75bc-480d-82c6-7d5df62216a4" containerName="glance-log" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.442272 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a5d6ee-2595-4413-95be-b4c4656e978d" containerName="glance-log" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.442291 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="3236fda8-50c3-4d65-9c07-3f9817cc3a9f" containerName="glance-api" Nov 26 22:59:14 crc 
kubenswrapper[5008]: I1126 22:59:14.442303 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="e94f5176-d76e-496c-b44e-7b167a690016" containerName="glance-api" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.442315 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd420810-75bc-480d-82c6-7d5df62216a4" containerName="glance-httpd" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.442327 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="21ce0079-9cf2-4a67-b731-fc1b02efaf3c" containerName="glance-log" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.442338 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a5d6ee-2595-4413-95be-b4c4656e978d" containerName="glance-api" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.442349 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd420810-75bc-480d-82c6-7d5df62216a4" containerName="glance-api" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.442362 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="21ce0079-9cf2-4a67-b731-fc1b02efaf3c" containerName="glance-httpd" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.442883 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance0d4d-account-delete-xwvl7" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.452790 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance0d4d-account-delete-xwvl7"] Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.589714 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7899db1e-3af6-42aa-ac74-4ca516bb353c-operator-scripts\") pod \"glance0d4d-account-delete-xwvl7\" (UID: \"7899db1e-3af6-42aa-ac74-4ca516bb353c\") " pod="glance-kuttl-tests/glance0d4d-account-delete-xwvl7" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.589899 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpj8g\" (UniqueName: \"kubernetes.io/projected/7899db1e-3af6-42aa-ac74-4ca516bb353c-kube-api-access-gpj8g\") pod \"glance0d4d-account-delete-xwvl7\" (UID: \"7899db1e-3af6-42aa-ac74-4ca516bb353c\") " pod="glance-kuttl-tests/glance0d4d-account-delete-xwvl7" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.691624 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpj8g\" (UniqueName: \"kubernetes.io/projected/7899db1e-3af6-42aa-ac74-4ca516bb353c-kube-api-access-gpj8g\") pod \"glance0d4d-account-delete-xwvl7\" (UID: \"7899db1e-3af6-42aa-ac74-4ca516bb353c\") " pod="glance-kuttl-tests/glance0d4d-account-delete-xwvl7" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.691719 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7899db1e-3af6-42aa-ac74-4ca516bb353c-operator-scripts\") pod \"glance0d4d-account-delete-xwvl7\" (UID: \"7899db1e-3af6-42aa-ac74-4ca516bb353c\") " pod="glance-kuttl-tests/glance0d4d-account-delete-xwvl7" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 
22:59:14.692443 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7899db1e-3af6-42aa-ac74-4ca516bb353c-operator-scripts\") pod \"glance0d4d-account-delete-xwvl7\" (UID: \"7899db1e-3af6-42aa-ac74-4ca516bb353c\") " pod="glance-kuttl-tests/glance0d4d-account-delete-xwvl7" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.712461 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpj8g\" (UniqueName: \"kubernetes.io/projected/7899db1e-3af6-42aa-ac74-4ca516bb353c-kube-api-access-gpj8g\") pod \"glance0d4d-account-delete-xwvl7\" (UID: \"7899db1e-3af6-42aa-ac74-4ca516bb353c\") " pod="glance-kuttl-tests/glance0d4d-account-delete-xwvl7" Nov 26 22:59:14 crc kubenswrapper[5008]: I1126 22:59:14.764475 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance0d4d-account-delete-xwvl7" Nov 26 22:59:15 crc kubenswrapper[5008]: I1126 22:59:15.068379 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance0d4d-account-delete-xwvl7"] Nov 26 22:59:15 crc kubenswrapper[5008]: I1126 22:59:15.528362 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f02c876-b9c0-4f21-9e7c-678d0a032155" path="/var/lib/kubelet/pods/6f02c876-b9c0-4f21-9e7c-678d0a032155/volumes" Nov 26 22:59:16 crc kubenswrapper[5008]: I1126 22:59:16.067206 5008 generic.go:334] "Generic (PLEG): container finished" podID="7899db1e-3af6-42aa-ac74-4ca516bb353c" containerID="b165c6a3d354a2a830ea8c450c759e46d8f2968e992166b5070f44199ffe9099" exitCode=0 Nov 26 22:59:16 crc kubenswrapper[5008]: I1126 22:59:16.067258 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance0d4d-account-delete-xwvl7" event={"ID":"7899db1e-3af6-42aa-ac74-4ca516bb353c","Type":"ContainerDied","Data":"b165c6a3d354a2a830ea8c450c759e46d8f2968e992166b5070f44199ffe9099"} Nov 26 22:59:16 crc 
kubenswrapper[5008]: I1126 22:59:16.067293 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance0d4d-account-delete-xwvl7" event={"ID":"7899db1e-3af6-42aa-ac74-4ca516bb353c","Type":"ContainerStarted","Data":"980f0cef60557b7e46848afb66b16a5d1fd0bc8d243a971fef19ce740f30679e"} Nov 26 22:59:17 crc kubenswrapper[5008]: I1126 22:59:17.456438 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance0d4d-account-delete-xwvl7" Nov 26 22:59:17 crc kubenswrapper[5008]: I1126 22:59:17.530587 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7899db1e-3af6-42aa-ac74-4ca516bb353c-operator-scripts\") pod \"7899db1e-3af6-42aa-ac74-4ca516bb353c\" (UID: \"7899db1e-3af6-42aa-ac74-4ca516bb353c\") " Nov 26 22:59:17 crc kubenswrapper[5008]: I1126 22:59:17.530667 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpj8g\" (UniqueName: \"kubernetes.io/projected/7899db1e-3af6-42aa-ac74-4ca516bb353c-kube-api-access-gpj8g\") pod \"7899db1e-3af6-42aa-ac74-4ca516bb353c\" (UID: \"7899db1e-3af6-42aa-ac74-4ca516bb353c\") " Nov 26 22:59:17 crc kubenswrapper[5008]: I1126 22:59:17.531193 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7899db1e-3af6-42aa-ac74-4ca516bb353c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7899db1e-3af6-42aa-ac74-4ca516bb353c" (UID: "7899db1e-3af6-42aa-ac74-4ca516bb353c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:59:17 crc kubenswrapper[5008]: I1126 22:59:17.536848 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7899db1e-3af6-42aa-ac74-4ca516bb353c-kube-api-access-gpj8g" (OuterVolumeSpecName: "kube-api-access-gpj8g") pod "7899db1e-3af6-42aa-ac74-4ca516bb353c" (UID: "7899db1e-3af6-42aa-ac74-4ca516bb353c"). InnerVolumeSpecName "kube-api-access-gpj8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:59:17 crc kubenswrapper[5008]: I1126 22:59:17.632164 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7899db1e-3af6-42aa-ac74-4ca516bb353c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:17 crc kubenswrapper[5008]: I1126 22:59:17.632218 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpj8g\" (UniqueName: \"kubernetes.io/projected/7899db1e-3af6-42aa-ac74-4ca516bb353c-kube-api-access-gpj8g\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:18 crc kubenswrapper[5008]: I1126 22:59:18.101159 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance0d4d-account-delete-xwvl7" event={"ID":"7899db1e-3af6-42aa-ac74-4ca516bb353c","Type":"ContainerDied","Data":"980f0cef60557b7e46848afb66b16a5d1fd0bc8d243a971fef19ce740f30679e"} Nov 26 22:59:18 crc kubenswrapper[5008]: I1126 22:59:18.102038 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="980f0cef60557b7e46848afb66b16a5d1fd0bc8d243a971fef19ce740f30679e" Nov 26 22:59:18 crc kubenswrapper[5008]: I1126 22:59:18.101265 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance0d4d-account-delete-xwvl7" Nov 26 22:59:19 crc kubenswrapper[5008]: I1126 22:59:19.551298 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-h6dk7"] Nov 26 22:59:19 crc kubenswrapper[5008]: I1126 22:59:19.551369 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-h6dk7"] Nov 26 22:59:19 crc kubenswrapper[5008]: I1126 22:59:19.551394 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance0d4d-account-delete-xwvl7"] Nov 26 22:59:19 crc kubenswrapper[5008]: I1126 22:59:19.553158 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance0d4d-account-delete-xwvl7"] Nov 26 22:59:19 crc kubenswrapper[5008]: I1126 22:59:19.557598 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-0d4d-account-create-update-q5hsc"] Nov 26 22:59:19 crc kubenswrapper[5008]: I1126 22:59:19.563096 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-0d4d-account-create-update-q5hsc"] Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.009069 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-584km"] Nov 26 22:59:20 crc kubenswrapper[5008]: E1126 22:59:20.010044 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7899db1e-3af6-42aa-ac74-4ca516bb353c" containerName="mariadb-account-delete" Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.010086 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="7899db1e-3af6-42aa-ac74-4ca516bb353c" containerName="mariadb-account-delete" Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.010447 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="7899db1e-3af6-42aa-ac74-4ca516bb353c" containerName="mariadb-account-delete" Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.011530 5008 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-584km" Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.025197 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-584km"] Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.036536 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-6893-account-create-update-25vzs"] Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.038055 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-6893-account-create-update-25vzs" Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.042100 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.044110 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-6893-account-create-update-25vzs"] Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.190263 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ca02e5f-27d8-4215-bb7c-491373544006-operator-scripts\") pod \"glance-6893-account-create-update-25vzs\" (UID: \"4ca02e5f-27d8-4215-bb7c-491373544006\") " pod="glance-kuttl-tests/glance-6893-account-create-update-25vzs" Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.190515 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx6f6\" (UniqueName: \"kubernetes.io/projected/4ca02e5f-27d8-4215-bb7c-491373544006-kube-api-access-jx6f6\") pod \"glance-6893-account-create-update-25vzs\" (UID: \"4ca02e5f-27d8-4215-bb7c-491373544006\") " pod="glance-kuttl-tests/glance-6893-account-create-update-25vzs" Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.190590 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a105ca6d-90e5-4f65-959e-59888265f31c-operator-scripts\") pod \"glance-db-create-584km\" (UID: \"a105ca6d-90e5-4f65-959e-59888265f31c\") " pod="glance-kuttl-tests/glance-db-create-584km" Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.190774 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc4bt\" (UniqueName: \"kubernetes.io/projected/a105ca6d-90e5-4f65-959e-59888265f31c-kube-api-access-cc4bt\") pod \"glance-db-create-584km\" (UID: \"a105ca6d-90e5-4f65-959e-59888265f31c\") " pod="glance-kuttl-tests/glance-db-create-584km" Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.292064 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ca02e5f-27d8-4215-bb7c-491373544006-operator-scripts\") pod \"glance-6893-account-create-update-25vzs\" (UID: \"4ca02e5f-27d8-4215-bb7c-491373544006\") " pod="glance-kuttl-tests/glance-6893-account-create-update-25vzs" Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.292125 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx6f6\" (UniqueName: \"kubernetes.io/projected/4ca02e5f-27d8-4215-bb7c-491373544006-kube-api-access-jx6f6\") pod \"glance-6893-account-create-update-25vzs\" (UID: \"4ca02e5f-27d8-4215-bb7c-491373544006\") " pod="glance-kuttl-tests/glance-6893-account-create-update-25vzs" Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.292143 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a105ca6d-90e5-4f65-959e-59888265f31c-operator-scripts\") pod \"glance-db-create-584km\" (UID: \"a105ca6d-90e5-4f65-959e-59888265f31c\") " 
pod="glance-kuttl-tests/glance-db-create-584km" Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.292179 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc4bt\" (UniqueName: \"kubernetes.io/projected/a105ca6d-90e5-4f65-959e-59888265f31c-kube-api-access-cc4bt\") pod \"glance-db-create-584km\" (UID: \"a105ca6d-90e5-4f65-959e-59888265f31c\") " pod="glance-kuttl-tests/glance-db-create-584km" Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.293154 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a105ca6d-90e5-4f65-959e-59888265f31c-operator-scripts\") pod \"glance-db-create-584km\" (UID: \"a105ca6d-90e5-4f65-959e-59888265f31c\") " pod="glance-kuttl-tests/glance-db-create-584km" Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.293259 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ca02e5f-27d8-4215-bb7c-491373544006-operator-scripts\") pod \"glance-6893-account-create-update-25vzs\" (UID: \"4ca02e5f-27d8-4215-bb7c-491373544006\") " pod="glance-kuttl-tests/glance-6893-account-create-update-25vzs" Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.323770 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc4bt\" (UniqueName: \"kubernetes.io/projected/a105ca6d-90e5-4f65-959e-59888265f31c-kube-api-access-cc4bt\") pod \"glance-db-create-584km\" (UID: \"a105ca6d-90e5-4f65-959e-59888265f31c\") " pod="glance-kuttl-tests/glance-db-create-584km" Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.330669 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx6f6\" (UniqueName: \"kubernetes.io/projected/4ca02e5f-27d8-4215-bb7c-491373544006-kube-api-access-jx6f6\") pod \"glance-6893-account-create-update-25vzs\" (UID: \"4ca02e5f-27d8-4215-bb7c-491373544006\") " 
pod="glance-kuttl-tests/glance-6893-account-create-update-25vzs" Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.340689 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-584km" Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.368373 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-6893-account-create-update-25vzs" Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.652827 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-584km"] Nov 26 22:59:20 crc kubenswrapper[5008]: I1126 22:59:20.876373 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-6893-account-create-update-25vzs"] Nov 26 22:59:20 crc kubenswrapper[5008]: W1126 22:59:20.879780 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ca02e5f_27d8_4215_bb7c_491373544006.slice/crio-d6a6841a66f922f2afba2fdd5b6a1ece860b70abcf9031a2527ca2c31e4ed1ce WatchSource:0}: Error finding container d6a6841a66f922f2afba2fdd5b6a1ece860b70abcf9031a2527ca2c31e4ed1ce: Status 404 returned error can't find the container with id d6a6841a66f922f2afba2fdd5b6a1ece860b70abcf9031a2527ca2c31e4ed1ce Nov 26 22:59:21 crc kubenswrapper[5008]: I1126 22:59:21.128812 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-6893-account-create-update-25vzs" event={"ID":"4ca02e5f-27d8-4215-bb7c-491373544006","Type":"ContainerStarted","Data":"42daf4284f1c2f283c8c41080c25acabb85b380c99a6177a7f8a7e22e6914b60"} Nov 26 22:59:21 crc kubenswrapper[5008]: I1126 22:59:21.128849 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-6893-account-create-update-25vzs" 
event={"ID":"4ca02e5f-27d8-4215-bb7c-491373544006","Type":"ContainerStarted","Data":"d6a6841a66f922f2afba2fdd5b6a1ece860b70abcf9031a2527ca2c31e4ed1ce"} Nov 26 22:59:21 crc kubenswrapper[5008]: I1126 22:59:21.131221 5008 generic.go:334] "Generic (PLEG): container finished" podID="a105ca6d-90e5-4f65-959e-59888265f31c" containerID="3e0c39e8307720e3021e6848f04c50009efdd2e1f977a9dd7345af7f3d10724c" exitCode=0 Nov 26 22:59:21 crc kubenswrapper[5008]: I1126 22:59:21.131270 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-584km" event={"ID":"a105ca6d-90e5-4f65-959e-59888265f31c","Type":"ContainerDied","Data":"3e0c39e8307720e3021e6848f04c50009efdd2e1f977a9dd7345af7f3d10724c"} Nov 26 22:59:21 crc kubenswrapper[5008]: I1126 22:59:21.131298 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-584km" event={"ID":"a105ca6d-90e5-4f65-959e-59888265f31c","Type":"ContainerStarted","Data":"3354f7ca0e7c6c29574e8634c4c073580ef4aa264ad49c7bd6f961bfc67c1f59"} Nov 26 22:59:21 crc kubenswrapper[5008]: I1126 22:59:21.145342 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-6893-account-create-update-25vzs" podStartSLOduration=2.145323989 podStartE2EDuration="2.145323989s" podCreationTimestamp="2025-11-26 22:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:59:21.142423948 +0000 UTC m=+1236.555117960" watchObservedRunningTime="2025-11-26 22:59:21.145323989 +0000 UTC m=+1236.558017991" Nov 26 22:59:21 crc kubenswrapper[5008]: I1126 22:59:21.531587 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22e07121-401d-4a21-b04c-baf4f8d7e8f2" path="/var/lib/kubelet/pods/22e07121-401d-4a21-b04c-baf4f8d7e8f2/volumes" Nov 26 22:59:21 crc kubenswrapper[5008]: I1126 22:59:21.533058 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="7899db1e-3af6-42aa-ac74-4ca516bb353c" path="/var/lib/kubelet/pods/7899db1e-3af6-42aa-ac74-4ca516bb353c/volumes" Nov 26 22:59:21 crc kubenswrapper[5008]: I1126 22:59:21.534243 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d62c944d-3e89-4287-ad8f-98cef6afa353" path="/var/lib/kubelet/pods/d62c944d-3e89-4287-ad8f-98cef6afa353/volumes" Nov 26 22:59:22 crc kubenswrapper[5008]: I1126 22:59:22.144147 5008 generic.go:334] "Generic (PLEG): container finished" podID="4ca02e5f-27d8-4215-bb7c-491373544006" containerID="42daf4284f1c2f283c8c41080c25acabb85b380c99a6177a7f8a7e22e6914b60" exitCode=0 Nov 26 22:59:22 crc kubenswrapper[5008]: I1126 22:59:22.144215 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-6893-account-create-update-25vzs" event={"ID":"4ca02e5f-27d8-4215-bb7c-491373544006","Type":"ContainerDied","Data":"42daf4284f1c2f283c8c41080c25acabb85b380c99a6177a7f8a7e22e6914b60"} Nov 26 22:59:22 crc kubenswrapper[5008]: I1126 22:59:22.517125 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-584km" Nov 26 22:59:22 crc kubenswrapper[5008]: I1126 22:59:22.636609 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a105ca6d-90e5-4f65-959e-59888265f31c-operator-scripts\") pod \"a105ca6d-90e5-4f65-959e-59888265f31c\" (UID: \"a105ca6d-90e5-4f65-959e-59888265f31c\") " Nov 26 22:59:22 crc kubenswrapper[5008]: I1126 22:59:22.636960 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc4bt\" (UniqueName: \"kubernetes.io/projected/a105ca6d-90e5-4f65-959e-59888265f31c-kube-api-access-cc4bt\") pod \"a105ca6d-90e5-4f65-959e-59888265f31c\" (UID: \"a105ca6d-90e5-4f65-959e-59888265f31c\") " Nov 26 22:59:22 crc kubenswrapper[5008]: I1126 22:59:22.638110 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a105ca6d-90e5-4f65-959e-59888265f31c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a105ca6d-90e5-4f65-959e-59888265f31c" (UID: "a105ca6d-90e5-4f65-959e-59888265f31c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:59:22 crc kubenswrapper[5008]: I1126 22:59:22.643250 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a105ca6d-90e5-4f65-959e-59888265f31c-kube-api-access-cc4bt" (OuterVolumeSpecName: "kube-api-access-cc4bt") pod "a105ca6d-90e5-4f65-959e-59888265f31c" (UID: "a105ca6d-90e5-4f65-959e-59888265f31c"). InnerVolumeSpecName "kube-api-access-cc4bt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:59:22 crc kubenswrapper[5008]: I1126 22:59:22.739293 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a105ca6d-90e5-4f65-959e-59888265f31c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:22 crc kubenswrapper[5008]: I1126 22:59:22.739338 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc4bt\" (UniqueName: \"kubernetes.io/projected/a105ca6d-90e5-4f65-959e-59888265f31c-kube-api-access-cc4bt\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:23 crc kubenswrapper[5008]: I1126 22:59:23.187313 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-584km" event={"ID":"a105ca6d-90e5-4f65-959e-59888265f31c","Type":"ContainerDied","Data":"3354f7ca0e7c6c29574e8634c4c073580ef4aa264ad49c7bd6f961bfc67c1f59"} Nov 26 22:59:23 crc kubenswrapper[5008]: I1126 22:59:23.189364 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3354f7ca0e7c6c29574e8634c4c073580ef4aa264ad49c7bd6f961bfc67c1f59" Nov 26 22:59:23 crc kubenswrapper[5008]: I1126 22:59:23.187347 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-584km" Nov 26 22:59:23 crc kubenswrapper[5008]: I1126 22:59:23.568808 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-6893-account-create-update-25vzs" Nov 26 22:59:23 crc kubenswrapper[5008]: I1126 22:59:23.659409 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx6f6\" (UniqueName: \"kubernetes.io/projected/4ca02e5f-27d8-4215-bb7c-491373544006-kube-api-access-jx6f6\") pod \"4ca02e5f-27d8-4215-bb7c-491373544006\" (UID: \"4ca02e5f-27d8-4215-bb7c-491373544006\") " Nov 26 22:59:23 crc kubenswrapper[5008]: I1126 22:59:23.659952 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ca02e5f-27d8-4215-bb7c-491373544006-operator-scripts\") pod \"4ca02e5f-27d8-4215-bb7c-491373544006\" (UID: \"4ca02e5f-27d8-4215-bb7c-491373544006\") " Nov 26 22:59:23 crc kubenswrapper[5008]: I1126 22:59:23.660847 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca02e5f-27d8-4215-bb7c-491373544006-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4ca02e5f-27d8-4215-bb7c-491373544006" (UID: "4ca02e5f-27d8-4215-bb7c-491373544006"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 22:59:23 crc kubenswrapper[5008]: I1126 22:59:23.663841 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ca02e5f-27d8-4215-bb7c-491373544006-kube-api-access-jx6f6" (OuterVolumeSpecName: "kube-api-access-jx6f6") pod "4ca02e5f-27d8-4215-bb7c-491373544006" (UID: "4ca02e5f-27d8-4215-bb7c-491373544006"). InnerVolumeSpecName "kube-api-access-jx6f6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:59:23 crc kubenswrapper[5008]: I1126 22:59:23.761581 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ca02e5f-27d8-4215-bb7c-491373544006-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:23 crc kubenswrapper[5008]: I1126 22:59:23.761629 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx6f6\" (UniqueName: \"kubernetes.io/projected/4ca02e5f-27d8-4215-bb7c-491373544006-kube-api-access-jx6f6\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:24 crc kubenswrapper[5008]: I1126 22:59:24.200002 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-6893-account-create-update-25vzs" event={"ID":"4ca02e5f-27d8-4215-bb7c-491373544006","Type":"ContainerDied","Data":"d6a6841a66f922f2afba2fdd5b6a1ece860b70abcf9031a2527ca2c31e4ed1ce"} Nov 26 22:59:24 crc kubenswrapper[5008]: I1126 22:59:24.200049 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-6893-account-create-update-25vzs" Nov 26 22:59:24 crc kubenswrapper[5008]: I1126 22:59:24.200059 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6a6841a66f922f2afba2fdd5b6a1ece860b70abcf9031a2527ca2c31e4ed1ce" Nov 26 22:59:25 crc kubenswrapper[5008]: I1126 22:59:25.167588 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-gzdw5"] Nov 26 22:59:25 crc kubenswrapper[5008]: E1126 22:59:25.167904 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a105ca6d-90e5-4f65-959e-59888265f31c" containerName="mariadb-database-create" Nov 26 22:59:25 crc kubenswrapper[5008]: I1126 22:59:25.167921 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="a105ca6d-90e5-4f65-959e-59888265f31c" containerName="mariadb-database-create" Nov 26 22:59:25 crc kubenswrapper[5008]: E1126 22:59:25.167948 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ca02e5f-27d8-4215-bb7c-491373544006" containerName="mariadb-account-create-update" Nov 26 22:59:25 crc kubenswrapper[5008]: I1126 22:59:25.167957 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca02e5f-27d8-4215-bb7c-491373544006" containerName="mariadb-account-create-update" Nov 26 22:59:25 crc kubenswrapper[5008]: I1126 22:59:25.168139 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="a105ca6d-90e5-4f65-959e-59888265f31c" containerName="mariadb-database-create" Nov 26 22:59:25 crc kubenswrapper[5008]: I1126 22:59:25.168160 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ca02e5f-27d8-4215-bb7c-491373544006" containerName="mariadb-account-create-update" Nov 26 22:59:25 crc kubenswrapper[5008]: I1126 22:59:25.168767 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-gzdw5" Nov 26 22:59:25 crc kubenswrapper[5008]: I1126 22:59:25.171795 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-p9xcr" Nov 26 22:59:25 crc kubenswrapper[5008]: I1126 22:59:25.172511 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Nov 26 22:59:25 crc kubenswrapper[5008]: I1126 22:59:25.185325 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/580f0ae9-1761-4d22-b98b-bd4af7c2752f-db-sync-config-data\") pod \"glance-db-sync-gzdw5\" (UID: \"580f0ae9-1761-4d22-b98b-bd4af7c2752f\") " pod="glance-kuttl-tests/glance-db-sync-gzdw5" Nov 26 22:59:25 crc kubenswrapper[5008]: I1126 22:59:25.185410 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m99cx\" (UniqueName: \"kubernetes.io/projected/580f0ae9-1761-4d22-b98b-bd4af7c2752f-kube-api-access-m99cx\") pod \"glance-db-sync-gzdw5\" (UID: \"580f0ae9-1761-4d22-b98b-bd4af7c2752f\") " pod="glance-kuttl-tests/glance-db-sync-gzdw5" Nov 26 22:59:25 crc kubenswrapper[5008]: I1126 22:59:25.185487 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/580f0ae9-1761-4d22-b98b-bd4af7c2752f-config-data\") pod \"glance-db-sync-gzdw5\" (UID: \"580f0ae9-1761-4d22-b98b-bd4af7c2752f\") " pod="glance-kuttl-tests/glance-db-sync-gzdw5" Nov 26 22:59:25 crc kubenswrapper[5008]: I1126 22:59:25.223118 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-gzdw5"] Nov 26 22:59:25 crc kubenswrapper[5008]: I1126 22:59:25.287779 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/580f0ae9-1761-4d22-b98b-bd4af7c2752f-config-data\") pod \"glance-db-sync-gzdw5\" (UID: \"580f0ae9-1761-4d22-b98b-bd4af7c2752f\") " pod="glance-kuttl-tests/glance-db-sync-gzdw5" Nov 26 22:59:25 crc kubenswrapper[5008]: I1126 22:59:25.287989 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/580f0ae9-1761-4d22-b98b-bd4af7c2752f-db-sync-config-data\") pod \"glance-db-sync-gzdw5\" (UID: \"580f0ae9-1761-4d22-b98b-bd4af7c2752f\") " pod="glance-kuttl-tests/glance-db-sync-gzdw5" Nov 26 22:59:25 crc kubenswrapper[5008]: I1126 22:59:25.288052 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m99cx\" (UniqueName: \"kubernetes.io/projected/580f0ae9-1761-4d22-b98b-bd4af7c2752f-kube-api-access-m99cx\") pod \"glance-db-sync-gzdw5\" (UID: \"580f0ae9-1761-4d22-b98b-bd4af7c2752f\") " pod="glance-kuttl-tests/glance-db-sync-gzdw5" Nov 26 22:59:25 crc kubenswrapper[5008]: I1126 22:59:25.298057 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/580f0ae9-1761-4d22-b98b-bd4af7c2752f-db-sync-config-data\") pod \"glance-db-sync-gzdw5\" (UID: \"580f0ae9-1761-4d22-b98b-bd4af7c2752f\") " pod="glance-kuttl-tests/glance-db-sync-gzdw5" Nov 26 22:59:25 crc kubenswrapper[5008]: I1126 22:59:25.298067 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/580f0ae9-1761-4d22-b98b-bd4af7c2752f-config-data\") pod \"glance-db-sync-gzdw5\" (UID: \"580f0ae9-1761-4d22-b98b-bd4af7c2752f\") " pod="glance-kuttl-tests/glance-db-sync-gzdw5" Nov 26 22:59:25 crc kubenswrapper[5008]: I1126 22:59:25.312677 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m99cx\" (UniqueName: \"kubernetes.io/projected/580f0ae9-1761-4d22-b98b-bd4af7c2752f-kube-api-access-m99cx\") pod 
\"glance-db-sync-gzdw5\" (UID: \"580f0ae9-1761-4d22-b98b-bd4af7c2752f\") " pod="glance-kuttl-tests/glance-db-sync-gzdw5" Nov 26 22:59:25 crc kubenswrapper[5008]: I1126 22:59:25.490028 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-gzdw5" Nov 26 22:59:26 crc kubenswrapper[5008]: I1126 22:59:26.005943 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-gzdw5"] Nov 26 22:59:26 crc kubenswrapper[5008]: I1126 22:59:26.244457 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-gzdw5" event={"ID":"580f0ae9-1761-4d22-b98b-bd4af7c2752f","Type":"ContainerStarted","Data":"7135764e48d17ff5a37272c01697604c705ecdaf5d850239a616205c36ae4cc4"} Nov 26 22:59:27 crc kubenswrapper[5008]: I1126 22:59:27.253527 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-gzdw5" event={"ID":"580f0ae9-1761-4d22-b98b-bd4af7c2752f","Type":"ContainerStarted","Data":"55564bbe623cba9993cd0e3a093b3755c513eac3dbbd97273556826db419c050"} Nov 26 22:59:27 crc kubenswrapper[5008]: I1126 22:59:27.282784 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-gzdw5" podStartSLOduration=2.28276031 podStartE2EDuration="2.28276031s" podCreationTimestamp="2025-11-26 22:59:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:59:27.273282922 +0000 UTC m=+1242.685976934" watchObservedRunningTime="2025-11-26 22:59:27.28276031 +0000 UTC m=+1242.695454342" Nov 26 22:59:29 crc kubenswrapper[5008]: I1126 22:59:29.273056 5008 generic.go:334] "Generic (PLEG): container finished" podID="580f0ae9-1761-4d22-b98b-bd4af7c2752f" containerID="55564bbe623cba9993cd0e3a093b3755c513eac3dbbd97273556826db419c050" exitCode=0 Nov 26 22:59:29 crc kubenswrapper[5008]: I1126 22:59:29.273122 5008 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-gzdw5" event={"ID":"580f0ae9-1761-4d22-b98b-bd4af7c2752f","Type":"ContainerDied","Data":"55564bbe623cba9993cd0e3a093b3755c513eac3dbbd97273556826db419c050"} Nov 26 22:59:30 crc kubenswrapper[5008]: I1126 22:59:30.684270 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-gzdw5" Nov 26 22:59:30 crc kubenswrapper[5008]: I1126 22:59:30.874136 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/580f0ae9-1761-4d22-b98b-bd4af7c2752f-db-sync-config-data\") pod \"580f0ae9-1761-4d22-b98b-bd4af7c2752f\" (UID: \"580f0ae9-1761-4d22-b98b-bd4af7c2752f\") " Nov 26 22:59:30 crc kubenswrapper[5008]: I1126 22:59:30.874281 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/580f0ae9-1761-4d22-b98b-bd4af7c2752f-config-data\") pod \"580f0ae9-1761-4d22-b98b-bd4af7c2752f\" (UID: \"580f0ae9-1761-4d22-b98b-bd4af7c2752f\") " Nov 26 22:59:30 crc kubenswrapper[5008]: I1126 22:59:30.874335 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m99cx\" (UniqueName: \"kubernetes.io/projected/580f0ae9-1761-4d22-b98b-bd4af7c2752f-kube-api-access-m99cx\") pod \"580f0ae9-1761-4d22-b98b-bd4af7c2752f\" (UID: \"580f0ae9-1761-4d22-b98b-bd4af7c2752f\") " Nov 26 22:59:30 crc kubenswrapper[5008]: I1126 22:59:30.883286 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/580f0ae9-1761-4d22-b98b-bd4af7c2752f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "580f0ae9-1761-4d22-b98b-bd4af7c2752f" (UID: "580f0ae9-1761-4d22-b98b-bd4af7c2752f"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:59:30 crc kubenswrapper[5008]: I1126 22:59:30.883663 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/580f0ae9-1761-4d22-b98b-bd4af7c2752f-kube-api-access-m99cx" (OuterVolumeSpecName: "kube-api-access-m99cx") pod "580f0ae9-1761-4d22-b98b-bd4af7c2752f" (UID: "580f0ae9-1761-4d22-b98b-bd4af7c2752f"). InnerVolumeSpecName "kube-api-access-m99cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:59:30 crc kubenswrapper[5008]: I1126 22:59:30.936645 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/580f0ae9-1761-4d22-b98b-bd4af7c2752f-config-data" (OuterVolumeSpecName: "config-data") pod "580f0ae9-1761-4d22-b98b-bd4af7c2752f" (UID: "580f0ae9-1761-4d22-b98b-bd4af7c2752f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:59:30 crc kubenswrapper[5008]: I1126 22:59:30.976446 5008 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/580f0ae9-1761-4d22-b98b-bd4af7c2752f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:30 crc kubenswrapper[5008]: I1126 22:59:30.976521 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/580f0ae9-1761-4d22-b98b-bd4af7c2752f-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:30 crc kubenswrapper[5008]: I1126 22:59:30.976548 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m99cx\" (UniqueName: \"kubernetes.io/projected/580f0ae9-1761-4d22-b98b-bd4af7c2752f-kube-api-access-m99cx\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:31 crc kubenswrapper[5008]: I1126 22:59:31.305421 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-gzdw5" 
event={"ID":"580f0ae9-1761-4d22-b98b-bd4af7c2752f","Type":"ContainerDied","Data":"7135764e48d17ff5a37272c01697604c705ecdaf5d850239a616205c36ae4cc4"} Nov 26 22:59:31 crc kubenswrapper[5008]: I1126 22:59:31.305513 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7135764e48d17ff5a37272c01697604c705ecdaf5d850239a616205c36ae4cc4" Nov 26 22:59:31 crc kubenswrapper[5008]: I1126 22:59:31.305672 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-gzdw5" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.603733 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 22:59:32 crc kubenswrapper[5008]: E1126 22:59:32.604175 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580f0ae9-1761-4d22-b98b-bd4af7c2752f" containerName="glance-db-sync" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.604198 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="580f0ae9-1761-4d22-b98b-bd4af7c2752f" containerName="glance-db-sync" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.604496 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="580f0ae9-1761-4d22-b98b-bd4af7c2752f" containerName="glance-db-sync" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.605946 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.609407 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-p9xcr" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.609653 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.610064 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.631648 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.711158 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-dev\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.711506 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-sys\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.711549 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 
crc kubenswrapper[5008]: I1126 22:59:32.711587 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-run\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.711605 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.711625 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ck8c\" (UniqueName: \"kubernetes.io/projected/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-kube-api-access-4ck8c\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.711643 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-scripts\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.711660 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.711690 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.711723 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.711926 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-config-data\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.712062 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-logs\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.712126 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-var-locks-brick\") pod 
\"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.712211 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.780172 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.781702 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.788152 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.803008 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.818152 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.818234 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.818302 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-logs\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.818345 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.818422 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.818493 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-dev\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.818543 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-sys\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.818582 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.818712 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-run\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.818744 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.818803 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-scripts\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.818830 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ck8c\" (UniqueName: \"kubernetes.io/projected/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-kube-api-access-4ck8c\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 
crc kubenswrapper[5008]: I1126 22:59:32.818872 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.818952 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.819110 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.819161 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-sys\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.819235 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.819487 5008 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.819583 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.820012 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-logs\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.820078 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.820109 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-run\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.820177 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.820828 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-dev\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.821126 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.825831 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-scripts\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.829712 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-config-data\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.847380 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.847973 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.849806 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ck8c\" (UniqueName: \"kubernetes.io/projected/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-kube-api-access-4ck8c\") pod \"glance-default-external-api-0\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.920865 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.920921 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.920955 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.921015 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-sys\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.921039 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-logs\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.921077 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.921096 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.921119 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-run\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.921159 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.921188 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-dev\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.921245 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-config-data\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.921270 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-scripts\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.921300 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2dlmd\" (UniqueName: \"kubernetes.io/projected/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-kube-api-access-2dlmd\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.921352 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:32 crc kubenswrapper[5008]: I1126 22:59:32.975119 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.023085 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.023130 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.023158 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-run\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.023205 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.023234 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-dev\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.023272 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-config-data\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.023295 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-scripts\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.023319 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dlmd\" (UniqueName: \"kubernetes.io/projected/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-kube-api-access-2dlmd\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" 
Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.023372 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.023395 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.023429 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.023458 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.023480 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-sys\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: 
I1126 22:59:33.023500 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-logs\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.024048 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-logs\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.024150 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.027223 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.027343 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.027560 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-dev\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.027626 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.027751 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.027781 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-run\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.027752 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.027860 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-sys\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.028348 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.035585 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-config-data\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.038881 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-scripts\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.048137 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.051826 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.055686 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dlmd\" (UniqueName: \"kubernetes.io/projected/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-kube-api-access-2dlmd\") pod \"glance-default-internal-api-0\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.113066 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.389485 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.441001 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 22:59:33 crc kubenswrapper[5008]: W1126 22:59:33.451898 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4d1aa49_4ec3_4f0e_ae36_d0359d195fff.slice/crio-7e637a27f9f6b4ea8066077ef7f25f4bf77f2a3fa12f210ae4f75dcdd8b893c3 WatchSource:0}: Error finding container 7e637a27f9f6b4ea8066077ef7f25f4bf77f2a3fa12f210ae4f75dcdd8b893c3: Status 404 returned error can't find the container with id 7e637a27f9f6b4ea8066077ef7f25f4bf77f2a3fa12f210ae4f75dcdd8b893c3 Nov 26 22:59:33 crc kubenswrapper[5008]: I1126 22:59:33.531897 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.333640 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec","Type":"ContainerStarted","Data":"aef9119bc48230e56837d9ade843af2779c6049a06a02a7cf4eb58bb3434958a"} Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.334274 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec","Type":"ContainerStarted","Data":"758ba60bc0fa4ab1d253e0cc08593c5485f6b1fec744c7e67d7b27899ad0ca0c"} Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.334291 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec","Type":"ContainerStarted","Data":"fba01039f1ee5dcba027ceafa43c43fa450f614068eb8cab7d0c5949b9f94d64"} Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.333866 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" containerName="glance-httpd" containerID="cri-o://aef9119bc48230e56837d9ade843af2779c6049a06a02a7cf4eb58bb3434958a" gracePeriod=30 Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.333795 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" containerName="glance-log" containerID="cri-o://758ba60bc0fa4ab1d253e0cc08593c5485f6b1fec744c7e67d7b27899ad0ca0c" gracePeriod=30 Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.338739 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff","Type":"ContainerStarted","Data":"036f02d70fe68dfdde18282da4ad181daf3d9d7c9b0efce30f4e75a31b5a061a"} Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.338777 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff","Type":"ContainerStarted","Data":"8340eb89c10fc7062c2ee6b1c408bd2024cd8ff3bcdc683c14004139c13e8de7"} Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.338790 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff","Type":"ContainerStarted","Data":"7e637a27f9f6b4ea8066077ef7f25f4bf77f2a3fa12f210ae4f75dcdd8b893c3"} Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.375914 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.375892041 podStartE2EDuration="3.375892041s" podCreationTimestamp="2025-11-26 22:59:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:59:34.368388425 +0000 UTC m=+1249.781082447" watchObservedRunningTime="2025-11-26 22:59:34.375892041 +0000 UTC m=+1249.788586053" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.397853 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.39783888 podStartE2EDuration="2.39783888s" podCreationTimestamp="2025-11-26 22:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:59:34.392872864 +0000 UTC m=+1249.805566906" watchObservedRunningTime="2025-11-26 22:59:34.39783888 +0000 UTC m=+1249.810532882" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.758653 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.850234 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-etc-nvme\") pod \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.850474 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" (UID: "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.850522 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dlmd\" (UniqueName: \"kubernetes.io/projected/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-kube-api-access-2dlmd\") pod \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.850544 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-scripts\") pod \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.850566 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-lib-modules\") pod \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.850587 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.850613 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-etc-iscsi\") pod \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.850637 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-run\") pod \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.851053 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" (UID: "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.851120 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" (UID: "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.851235 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-run" (OuterVolumeSpecName: "run") pod "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" (UID: "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.851399 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-logs\") pod \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.851452 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-var-locks-brick\") pod \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.851479 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-sys\") pod \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.851507 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-dev\") pod \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.851739 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-httpd-run\") pod \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.851742 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" (UID: "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.851755 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.851865 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-config-data\") pod \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\" (UID: \"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec\") " Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.852586 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.852608 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.852628 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.852645 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.852663 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.852636 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-sys" (OuterVolumeSpecName: "sys") pod "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" (UID: "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.852757 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" (UID: "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.852849 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-logs" (OuterVolumeSpecName: "logs") pod "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" (UID: "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.852549 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-dev" (OuterVolumeSpecName: "dev") pod "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" (UID: "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.858771 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" (UID: "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.860528 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-scripts" (OuterVolumeSpecName: "scripts") pod "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" (UID: "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.862145 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-kube-api-access-2dlmd" (OuterVolumeSpecName: "kube-api-access-2dlmd") pod "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" (UID: "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec"). InnerVolumeSpecName "kube-api-access-2dlmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.862291 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance-cache") pod "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" (UID: "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.930037 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-config-data" (OuterVolumeSpecName: "config-data") pod "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" (UID: "df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.953863 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-dev\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.953895 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.953927 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.953936 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.953946 5008 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dlmd\" (UniqueName: \"kubernetes.io/projected/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-kube-api-access-2dlmd\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.953955 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.953970 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.953989 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-logs\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.953998 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec-sys\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.966260 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 26 22:59:34 crc kubenswrapper[5008]: I1126 22:59:34.986568 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.055945 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.055997 5008 
reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.350837 5008 generic.go:334] "Generic (PLEG): container finished" podID="df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" containerID="aef9119bc48230e56837d9ade843af2779c6049a06a02a7cf4eb58bb3434958a" exitCode=143 Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.351281 5008 generic.go:334] "Generic (PLEG): container finished" podID="df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" containerID="758ba60bc0fa4ab1d253e0cc08593c5485f6b1fec744c7e67d7b27899ad0ca0c" exitCode=143 Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.350929 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec","Type":"ContainerDied","Data":"aef9119bc48230e56837d9ade843af2779c6049a06a02a7cf4eb58bb3434958a"} Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.350956 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.351366 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec","Type":"ContainerDied","Data":"758ba60bc0fa4ab1d253e0cc08593c5485f6b1fec744c7e67d7b27899ad0ca0c"} Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.351399 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec","Type":"ContainerDied","Data":"fba01039f1ee5dcba027ceafa43c43fa450f614068eb8cab7d0c5949b9f94d64"} Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.351418 5008 scope.go:117] "RemoveContainer" containerID="aef9119bc48230e56837d9ade843af2779c6049a06a02a7cf4eb58bb3434958a" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.385927 5008 scope.go:117] "RemoveContainer" containerID="758ba60bc0fa4ab1d253e0cc08593c5485f6b1fec744c7e67d7b27899ad0ca0c" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.404084 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.414034 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.425090 5008 scope.go:117] "RemoveContainer" containerID="aef9119bc48230e56837d9ade843af2779c6049a06a02a7cf4eb58bb3434958a" Nov 26 22:59:35 crc kubenswrapper[5008]: E1126 22:59:35.425864 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aef9119bc48230e56837d9ade843af2779c6049a06a02a7cf4eb58bb3434958a\": container with ID starting with aef9119bc48230e56837d9ade843af2779c6049a06a02a7cf4eb58bb3434958a not found: ID does not exist" 
containerID="aef9119bc48230e56837d9ade843af2779c6049a06a02a7cf4eb58bb3434958a" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.425918 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aef9119bc48230e56837d9ade843af2779c6049a06a02a7cf4eb58bb3434958a"} err="failed to get container status \"aef9119bc48230e56837d9ade843af2779c6049a06a02a7cf4eb58bb3434958a\": rpc error: code = NotFound desc = could not find container \"aef9119bc48230e56837d9ade843af2779c6049a06a02a7cf4eb58bb3434958a\": container with ID starting with aef9119bc48230e56837d9ade843af2779c6049a06a02a7cf4eb58bb3434958a not found: ID does not exist" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.425952 5008 scope.go:117] "RemoveContainer" containerID="758ba60bc0fa4ab1d253e0cc08593c5485f6b1fec744c7e67d7b27899ad0ca0c" Nov 26 22:59:35 crc kubenswrapper[5008]: E1126 22:59:35.426661 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"758ba60bc0fa4ab1d253e0cc08593c5485f6b1fec744c7e67d7b27899ad0ca0c\": container with ID starting with 758ba60bc0fa4ab1d253e0cc08593c5485f6b1fec744c7e67d7b27899ad0ca0c not found: ID does not exist" containerID="758ba60bc0fa4ab1d253e0cc08593c5485f6b1fec744c7e67d7b27899ad0ca0c" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.426733 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"758ba60bc0fa4ab1d253e0cc08593c5485f6b1fec744c7e67d7b27899ad0ca0c"} err="failed to get container status \"758ba60bc0fa4ab1d253e0cc08593c5485f6b1fec744c7e67d7b27899ad0ca0c\": rpc error: code = NotFound desc = could not find container \"758ba60bc0fa4ab1d253e0cc08593c5485f6b1fec744c7e67d7b27899ad0ca0c\": container with ID starting with 758ba60bc0fa4ab1d253e0cc08593c5485f6b1fec744c7e67d7b27899ad0ca0c not found: ID does not exist" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.426770 5008 scope.go:117] 
"RemoveContainer" containerID="aef9119bc48230e56837d9ade843af2779c6049a06a02a7cf4eb58bb3434958a" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.427617 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aef9119bc48230e56837d9ade843af2779c6049a06a02a7cf4eb58bb3434958a"} err="failed to get container status \"aef9119bc48230e56837d9ade843af2779c6049a06a02a7cf4eb58bb3434958a\": rpc error: code = NotFound desc = could not find container \"aef9119bc48230e56837d9ade843af2779c6049a06a02a7cf4eb58bb3434958a\": container with ID starting with aef9119bc48230e56837d9ade843af2779c6049a06a02a7cf4eb58bb3434958a not found: ID does not exist" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.427659 5008 scope.go:117] "RemoveContainer" containerID="758ba60bc0fa4ab1d253e0cc08593c5485f6b1fec744c7e67d7b27899ad0ca0c" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.428250 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"758ba60bc0fa4ab1d253e0cc08593c5485f6b1fec744c7e67d7b27899ad0ca0c"} err="failed to get container status \"758ba60bc0fa4ab1d253e0cc08593c5485f6b1fec744c7e67d7b27899ad0ca0c\": rpc error: code = NotFound desc = could not find container \"758ba60bc0fa4ab1d253e0cc08593c5485f6b1fec744c7e67d7b27899ad0ca0c\": container with ID starting with 758ba60bc0fa4ab1d253e0cc08593c5485f6b1fec744c7e67d7b27899ad0ca0c not found: ID does not exist" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.434524 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:59:35 crc kubenswrapper[5008]: E1126 22:59:35.434871 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" containerName="glance-httpd" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.434892 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" 
containerName="glance-httpd" Nov 26 22:59:35 crc kubenswrapper[5008]: E1126 22:59:35.434920 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" containerName="glance-log" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.434928 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" containerName="glance-log" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.435108 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" containerName="glance-httpd" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.435135 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" containerName="glance-log" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.436025 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.439268 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.449098 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.531734 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec" path="/var/lib/kubelet/pods/df42ba8e-d0f3-45d1-9c6d-0e61b1d1fdec/volumes" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.570965 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-run\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.571068 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e4686af-ab67-4a04-a97f-48508ff06a2e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.571095 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xddl\" (UniqueName: \"kubernetes.io/projected/8e4686af-ab67-4a04-a97f-48508ff06a2e-kube-api-access-8xddl\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.571334 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e4686af-ab67-4a04-a97f-48508ff06a2e-logs\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.571528 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e4686af-ab67-4a04-a97f-48508ff06a2e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.571653 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-sys\") pod \"glance-default-internal-api-0\" (UID: 
\"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.571720 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.571813 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-dev\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.571992 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.572075 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.572140 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8e4686af-ab67-4a04-a97f-48508ff06a2e-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.572186 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.572434 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.572478 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.674463 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8e4686af-ab67-4a04-a97f-48508ff06a2e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.674724 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.674778 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.674825 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.674873 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-run\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.674916 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.674933 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e4686af-ab67-4a04-a97f-48508ff06a2e-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.674951 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.675002 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xddl\" (UniqueName: \"kubernetes.io/projected/8e4686af-ab67-4a04-a97f-48508ff06a2e-kube-api-access-8xddl\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.675125 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e4686af-ab67-4a04-a97f-48508ff06a2e-logs\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.675219 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e4686af-ab67-4a04-a97f-48508ff06a2e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.675209 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-run\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.675278 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-sys\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.675223 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.675336 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8e4686af-ab67-4a04-a97f-48508ff06a2e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.675356 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-sys\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.675371 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 
22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.675399 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.675449 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-dev\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.675486 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-dev\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.675581 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.675610 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.675661 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.675773 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.676070 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e4686af-ab67-4a04-a97f-48508ff06a2e-logs\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.683032 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e4686af-ab67-4a04-a97f-48508ff06a2e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.684115 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e4686af-ab67-4a04-a97f-48508ff06a2e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.707063 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xddl\" 
(UniqueName: \"kubernetes.io/projected/8e4686af-ab67-4a04-a97f-48508ff06a2e-kube-api-access-8xddl\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.713866 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.722787 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:35 crc kubenswrapper[5008]: I1126 22:59:35.776959 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:36 crc kubenswrapper[5008]: I1126 22:59:36.274025 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 22:59:36 crc kubenswrapper[5008]: I1126 22:59:36.364489 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"8e4686af-ab67-4a04-a97f-48508ff06a2e","Type":"ContainerStarted","Data":"789deabd240c3061a4ad9b95415eba75d1082ac206c50cfd73637d27e98ac84c"} Nov 26 22:59:37 crc kubenswrapper[5008]: I1126 22:59:37.383352 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"8e4686af-ab67-4a04-a97f-48508ff06a2e","Type":"ContainerStarted","Data":"55ecd3c7ce870459cb822971b9f95c2d4717bf97e84ee95831d11acf572ca6e5"} Nov 26 22:59:37 crc kubenswrapper[5008]: I1126 22:59:37.384105 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"8e4686af-ab67-4a04-a97f-48508ff06a2e","Type":"ContainerStarted","Data":"f749b687edb24b365e19d42c56ef725c0e6d7204c5df5c2b5f870d295f0b8764"} Nov 26 22:59:37 crc kubenswrapper[5008]: I1126 22:59:37.421428 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.421404827 podStartE2EDuration="2.421404827s" podCreationTimestamp="2025-11-26 22:59:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:59:37.41732547 +0000 UTC m=+1252.830019552" watchObservedRunningTime="2025-11-26 22:59:37.421404827 +0000 UTC m=+1252.834098869" Nov 26 22:59:42 crc kubenswrapper[5008]: I1126 22:59:42.976095 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:42 crc kubenswrapper[5008]: I1126 22:59:42.977007 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:43 crc kubenswrapper[5008]: I1126 22:59:43.004435 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:43 crc kubenswrapper[5008]: I1126 22:59:43.040479 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:43 crc kubenswrapper[5008]: I1126 22:59:43.447497 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:43 crc kubenswrapper[5008]: I1126 22:59:43.447554 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:45 crc kubenswrapper[5008]: I1126 22:59:45.338477 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:45 crc kubenswrapper[5008]: I1126 22:59:45.386103 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 22:59:45 crc kubenswrapper[5008]: I1126 22:59:45.777912 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:45 crc kubenswrapper[5008]: I1126 22:59:45.778031 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:45 crc kubenswrapper[5008]: I1126 22:59:45.806790 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:45 crc 
kubenswrapper[5008]: I1126 22:59:45.844707 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:46 crc kubenswrapper[5008]: I1126 22:59:46.478331 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:46 crc kubenswrapper[5008]: I1126 22:59:46.478393 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:48 crc kubenswrapper[5008]: I1126 22:59:48.355090 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:48 crc kubenswrapper[5008]: I1126 22:59:48.390031 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.656818 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.660051 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.663778 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.665756 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.689029 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.706512 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.719143 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.720546 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.722955 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.725074 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.765490 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.765597 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.765637 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c5b25d7-9f9f-4962-874e-01dc54b179a8-config-data\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.765667 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.765706 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-sys\") pod 
\"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.765740 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c5b25d7-9f9f-4962-874e-01dc54b179a8-logs\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.765777 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf41e648-346e-4019-8a38-80d1508dd2d2-logs\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.765998 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.766096 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf41e648-346e-4019-8a38-80d1508dd2d2-config-data\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.766134 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-dev\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.766186 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.766218 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tszmh\" (UniqueName: \"kubernetes.io/projected/1c5b25d7-9f9f-4962-874e-01dc54b179a8-kube-api-access-tszmh\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.766320 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf41e648-346e-4019-8a38-80d1508dd2d2-scripts\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.766441 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c72nw\" (UniqueName: \"kubernetes.io/projected/cf41e648-346e-4019-8a38-80d1508dd2d2-kube-api-access-c72nw\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.766531 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.766622 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf41e648-346e-4019-8a38-80d1508dd2d2-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.766682 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.766733 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.766796 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc 
kubenswrapper[5008]: I1126 22:59:50.766871 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-run\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.766913 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-run\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.766952 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.767078 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.767110 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c5b25d7-9f9f-4962-874e-01dc54b179a8-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 
22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.767162 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-sys\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.767343 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.767488 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c5b25d7-9f9f-4962-874e-01dc54b179a8-scripts\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.767552 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-dev\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.770703 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.779481 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.868844 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-dev\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.868923 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf41e648-346e-4019-8a38-80d1508dd2d2-config-data\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.868956 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-dev\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.869013 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.869050 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tszmh\" (UniqueName: \"kubernetes.io/projected/1c5b25d7-9f9f-4962-874e-01dc54b179a8-kube-api-access-tszmh\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.869083 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf41e648-346e-4019-8a38-80d1508dd2d2-scripts\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.869114 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.869147 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-logs\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.869194 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c72nw\" (UniqueName: \"kubernetes.io/projected/cf41e648-346e-4019-8a38-80d1508dd2d2-kube-api-access-c72nw\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.869246 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-run\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.869277 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.869318 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf41e648-346e-4019-8a38-80d1508dd2d2-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.869356 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.869393 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.869435 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.869471 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvwgh\" (UniqueName: \"kubernetes.io/projected/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-kube-api-access-dvwgh\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.869530 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.869568 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.869634 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.869650 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-dev\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.869946 5008 operation_generator.go:580] "MountVolume.MountDevice 
succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") device mount path \"/mnt/openstack/pv18\"" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.869999 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.870227 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.870297 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.870358 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-run\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.870407 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-run\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.870456 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.870509 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.870553 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c5b25d7-9f9f-4962-874e-01dc54b179a8-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.870607 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.870618 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-run\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.870663 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-run\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.870749 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf41e648-346e-4019-8a38-80d1508dd2d2-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.870898 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871004 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-sys\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871071 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-sys\") pod \"glance-default-external-api-1\" (UID: 
\"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871074 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-config-data\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871131 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c5b25d7-9f9f-4962-874e-01dc54b179a8-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871186 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871250 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871316 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c5b25d7-9f9f-4962-874e-01dc54b179a8-scripts\") pod \"glance-default-external-api-1\" (UID: 
\"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871355 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871388 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-dev\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871447 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871502 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-sys\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871555 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: 
\"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871577 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-dev\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871613 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c5b25d7-9f9f-4962-874e-01dc54b179a8-config-data\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871661 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871699 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871715 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871767 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871822 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-sys\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871876 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.871920 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c5b25d7-9f9f-4962-874e-01dc54b179a8-logs\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.872006 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-scripts\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.872056 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf41e648-346e-4019-8a38-80d1508dd2d2-logs\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.872101 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.872264 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.872443 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.872477 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-sys\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" 
Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.873119 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf41e648-346e-4019-8a38-80d1508dd2d2-logs\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.873182 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.873721 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c5b25d7-9f9f-4962-874e-01dc54b179a8-logs\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.877339 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.887354 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c5b25d7-9f9f-4962-874e-01dc54b179a8-scripts\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.888762 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c5b25d7-9f9f-4962-874e-01dc54b179a8-config-data\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.889036 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf41e648-346e-4019-8a38-80d1508dd2d2-config-data\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.889845 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf41e648-346e-4019-8a38-80d1508dd2d2-scripts\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.895665 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c72nw\" (UniqueName: \"kubernetes.io/projected/cf41e648-346e-4019-8a38-80d1508dd2d2-kube-api-access-c72nw\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.895793 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.902461 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tszmh\" (UniqueName: \"kubernetes.io/projected/1c5b25d7-9f9f-4962-874e-01dc54b179a8-kube-api-access-tszmh\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.909179 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.918073 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.926203 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-1\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.973892 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.974098 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-etc-nvme\") 
pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.974196 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.974274 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmd98\" (UniqueName: \"kubernetes.io/projected/1df15eb2-fdba-49b3-831e-9ff30608c249-kube-api-access-pmd98\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.974378 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-scripts\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.974438 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-dev\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.974491 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod 
\"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.974513 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-logs\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.974577 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1df15eb2-fdba-49b3-831e-9ff30608c249-scripts\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.974585 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-dev\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.974614 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.974696 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-run\") pod \"glance-default-internal-api-1\" (UID: 
\"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.974724 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-run\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.974746 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") device mount path \"/mnt/openstack/pv20\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.974820 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-run\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.975042 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.975089 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvwgh\" (UniqueName: \"kubernetes.io/projected/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-kube-api-access-dvwgh\") pod \"glance-default-internal-api-1\" (UID: 
\"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.975162 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-sys\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.975194 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1df15eb2-fdba-49b3-831e-9ff30608c249-logs\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.975201 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.975407 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-dev\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.975448 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-var-locks-brick\") pod 
\"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.975516 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.975625 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.975687 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-config-data\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.975766 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.975807 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df15eb2-fdba-49b3-831e-9ff30608c249-config-data\") pod 
\"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.975877 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.975907 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1df15eb2-fdba-49b3-831e-9ff30608c249-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.975984 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-sys\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.976039 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.976171 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-logs\") pod \"glance-default-internal-api-1\" (UID: 
\"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.976180 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.976233 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.976270 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-sys\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.976320 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.976387 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") device mount path 
\"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.976446 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.976486 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.978109 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.982069 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-config-data\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.982827 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-scripts\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:50 crc kubenswrapper[5008]: I1126 22:59:50.996073 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvwgh\" (UniqueName: \"kubernetes.io/projected/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-kube-api-access-dvwgh\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.001744 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.002281 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-1\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.002365 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.002863 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.024406 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.051369 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.079356 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df15eb2-fdba-49b3-831e-9ff30608c249-config-data\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.079414 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.079446 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1df15eb2-fdba-49b3-831e-9ff30608c249-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.079498 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.079548 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.079577 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmd98\" (UniqueName: \"kubernetes.io/projected/1df15eb2-fdba-49b3-831e-9ff30608c249-kube-api-access-pmd98\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.079651 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1df15eb2-fdba-49b3-831e-9ff30608c249-scripts\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.079646 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.079743 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.080853 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/1df15eb2-fdba-49b3-831e-9ff30608c249-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.080919 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.083014 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.079685 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.083702 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-run\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.083874 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-sys\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.083909 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1df15eb2-fdba-49b3-831e-9ff30608c249-logs\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.083944 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-dev\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.084011 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.084105 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-sys\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.084241 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-run\") pod 
\"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.084255 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.084288 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-dev\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.084567 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1df15eb2-fdba-49b3-831e-9ff30608c249-logs\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.086617 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1df15eb2-fdba-49b3-831e-9ff30608c249-scripts\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.087485 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df15eb2-fdba-49b3-831e-9ff30608c249-config-data\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " 
pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.116015 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmd98\" (UniqueName: \"kubernetes.io/projected/1df15eb2-fdba-49b3-831e-9ff30608c249-kube-api-access-pmd98\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.123215 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-2\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.369910 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.439555 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 26 22:59:51 crc kubenswrapper[5008]: W1126 22:59:51.446548 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c5b25d7_9f9f_4962_874e_01dc54b179a8.slice/crio-09faa6a915af7abb5bb954fd4dc4fe5742c56ac7836b09e194ae124551444780 WatchSource:0}: Error finding container 09faa6a915af7abb5bb954fd4dc4fe5742c56ac7836b09e194ae124551444780: Status 404 returned error can't find the container with id 09faa6a915af7abb5bb954fd4dc4fe5742c56ac7836b09e194ae124551444780 Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.530645 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.530676 5008 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"1c5b25d7-9f9f-4962-874e-01dc54b179a8","Type":"ContainerStarted","Data":"09faa6a915af7abb5bb954fd4dc4fe5742c56ac7836b09e194ae124551444780"} Nov 26 22:59:51 crc kubenswrapper[5008]: W1126 22:59:51.537499 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf41e648_346e_4019_8a38_80d1508dd2d2.slice/crio-fc6827bfd6137d7fbb8fd960ffaec5795869fe1ace2bf337b088a897e7a300fe WatchSource:0}: Error finding container fc6827bfd6137d7fbb8fd960ffaec5795869fe1ace2bf337b088a897e7a300fe: Status 404 returned error can't find the container with id fc6827bfd6137d7fbb8fd960ffaec5795869fe1ace2bf337b088a897e7a300fe Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.561803 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Nov 26 22:59:51 crc kubenswrapper[5008]: W1126 22:59:51.570327 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f23a622_9f8e_42cb_b6a8_b8e2fce1e99d.slice/crio-8d0336f891bfc23eb086e87c912b63783365afd890be1db2a6f5160930ccefc4 WatchSource:0}: Error finding container 8d0336f891bfc23eb086e87c912b63783365afd890be1db2a6f5160930ccefc4: Status 404 returned error can't find the container with id 8d0336f891bfc23eb086e87c912b63783365afd890be1db2a6f5160930ccefc4 Nov 26 22:59:51 crc kubenswrapper[5008]: I1126 22:59:51.650315 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Nov 26 22:59:52 crc kubenswrapper[5008]: I1126 22:59:52.536398 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"1df15eb2-fdba-49b3-831e-9ff30608c249","Type":"ContainerStarted","Data":"de27d0158d9b4b33c7060b41d4fbacb1f18aeaae6e3d203c586e4cf10e0328c3"} Nov 
26 22:59:52 crc kubenswrapper[5008]: I1126 22:59:52.537401 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"1df15eb2-fdba-49b3-831e-9ff30608c249","Type":"ContainerStarted","Data":"7b46dde8bed842e1bea0c357187cbb0a262e57b11275fc456506add6e86bad9c"} Nov 26 22:59:52 crc kubenswrapper[5008]: I1126 22:59:52.537444 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"1df15eb2-fdba-49b3-831e-9ff30608c249","Type":"ContainerStarted","Data":"6df8db82837475a7f174f0e2de6607faeea55234c1d3bdf4422b96bfaf818d66"} Nov 26 22:59:52 crc kubenswrapper[5008]: I1126 22:59:52.539730 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"cf41e648-346e-4019-8a38-80d1508dd2d2","Type":"ContainerStarted","Data":"3e66c76616ad120c07ab33d111bed66ba7f2191bfb4d02dc3c1216191575dd72"} Nov 26 22:59:52 crc kubenswrapper[5008]: I1126 22:59:52.539822 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"cf41e648-346e-4019-8a38-80d1508dd2d2","Type":"ContainerStarted","Data":"f68ddbee848e96ccbbdf85288e5752b18e0e10c5a723022ca70176edb1042c55"} Nov 26 22:59:52 crc kubenswrapper[5008]: I1126 22:59:52.539853 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"cf41e648-346e-4019-8a38-80d1508dd2d2","Type":"ContainerStarted","Data":"fc6827bfd6137d7fbb8fd960ffaec5795869fe1ace2bf337b088a897e7a300fe"} Nov 26 22:59:52 crc kubenswrapper[5008]: I1126 22:59:52.543869 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d","Type":"ContainerStarted","Data":"f4ef23dce81c940e4b3b566f6de12c69a4f05aac6498107f3f55e41033094034"} Nov 26 22:59:52 crc kubenswrapper[5008]: I1126 
22:59:52.543938 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d","Type":"ContainerStarted","Data":"2d950477c7980bc552685cb0a2525fb49f76d35b18998ddd8e59a38e33507898"} Nov 26 22:59:52 crc kubenswrapper[5008]: I1126 22:59:52.544021 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d","Type":"ContainerStarted","Data":"8d0336f891bfc23eb086e87c912b63783365afd890be1db2a6f5160930ccefc4"} Nov 26 22:59:52 crc kubenswrapper[5008]: I1126 22:59:52.547270 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"1c5b25d7-9f9f-4962-874e-01dc54b179a8","Type":"ContainerStarted","Data":"1e78ed00c65d181e0ccaaf41266ab01b3970c036d88608ffa711bca10e70818d"} Nov 26 22:59:52 crc kubenswrapper[5008]: I1126 22:59:52.547318 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"1c5b25d7-9f9f-4962-874e-01dc54b179a8","Type":"ContainerStarted","Data":"0a3539f87f23a1657de7bf2fb9b018ce7c74820d36d1ebeaa0602bf7313a2558"} Nov 26 22:59:52 crc kubenswrapper[5008]: I1126 22:59:52.586768 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-2" podStartSLOduration=3.5867275149999998 podStartE2EDuration="3.586727515s" podCreationTimestamp="2025-11-26 22:59:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:59:52.573885742 +0000 UTC m=+1267.986579784" watchObservedRunningTime="2025-11-26 22:59:52.586727515 +0000 UTC m=+1267.999421557" Nov 26 22:59:52 crc kubenswrapper[5008]: I1126 22:59:52.617877 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=3.617847412 podStartE2EDuration="3.617847412s" podCreationTimestamp="2025-11-26 22:59:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:59:52.608684014 +0000 UTC m=+1268.021378026" watchObservedRunningTime="2025-11-26 22:59:52.617847412 +0000 UTC m=+1268.030541454" Nov 26 22:59:52 crc kubenswrapper[5008]: I1126 22:59:52.650319 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=3.650295572 podStartE2EDuration="3.650295572s" podCreationTimestamp="2025-11-26 22:59:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:59:52.635470766 +0000 UTC m=+1268.048164778" watchObservedRunningTime="2025-11-26 22:59:52.650295572 +0000 UTC m=+1268.062989604" Nov 26 22:59:52 crc kubenswrapper[5008]: I1126 22:59:52.670665 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-2" podStartSLOduration=3.6706427010000002 podStartE2EDuration="3.670642701s" podCreationTimestamp="2025-11-26 22:59:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 22:59:52.662630009 +0000 UTC m=+1268.075324051" watchObservedRunningTime="2025-11-26 22:59:52.670642701 +0000 UTC m=+1268.083336713" Nov 26 22:59:59 crc kubenswrapper[5008]: I1126 22:59:59.281129 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 22:59:59 crc 
kubenswrapper[5008]: I1126 22:59:59.281888 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.155552 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5"] Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.157423 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.170856 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r"] Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.172770 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.182819 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn"] Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.184040 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.191623 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g"] Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.193952 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.200894 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz"] Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.202129 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.209313 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp"] Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.210245 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.215941 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5"] Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.222500 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g"] Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.228935 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn"] Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.234702 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz"] Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.239944 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r"] Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.253019 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd"] Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.254182 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.256196 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.256221 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.258676 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp"] Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.261601 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/434f550d-aa2c-4737-a324-236570bcd516-image-cache-config-data\") pod \"glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz\" (UID: \"434f550d-aa2c-4737-a324-236570bcd516\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.261659 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn\" (UID: \"54ecc7bd-bb80-4362-8b26-d1f7f78b1455\") " 
pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.261693 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/54ecc7bd-bb80-4362-8b26-d1f7f78b1455-image-cache-config-data\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn\" (UID: \"54ecc7bd-bb80-4362-8b26-d1f7f78b1455\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.261720 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6mf5\" (UniqueName: \"kubernetes.io/projected/285f499e-c63a-45ed-9d95-9ad362cd5c69-kube-api-access-r6mf5\") pod \"glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5\" (UID: \"285f499e-c63a-45ed-9d95-9ad362cd5c69\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.261751 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz\" (UID: \"434f550d-aa2c-4737-a324-236570bcd516\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.261792 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xg62\" (UniqueName: \"kubernetes.io/projected/c7b1631a-df7b-468c-af5b-c2b122fc10c2-kube-api-access-9xg62\") pod \"glance-cache-glance-default-internal-api-2-cleaner-29403309n24g\" (UID: \"c7b1631a-df7b-468c-af5b-c2b122fc10c2\") " 
pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.261822 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgfb2\" (UniqueName: \"kubernetes.io/projected/434f550d-aa2c-4737-a324-236570bcd516-kube-api-access-xgfb2\") pod \"glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz\" (UID: \"434f550d-aa2c-4737-a324-236570bcd516\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.263127 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/f93d1a20-d7fd-441f-a496-c3cb5423ba2c-image-cache-config-data\") pod \"glance-cache-glance-default-external-api-2-cleaner-29403307x27r\" (UID: \"f93d1a20-d7fd-441f-a496-c3cb5423ba2c\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.263758 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h82vb\" (UniqueName: \"kubernetes.io/projected/54ecc7bd-bb80-4362-8b26-d1f7f78b1455-kube-api-access-h82vb\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn\" (UID: \"54ecc7bd-bb80-4362-8b26-d1f7f78b1455\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.263992 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5\" (UID: \"285f499e-c63a-45ed-9d95-9ad362cd5c69\") " 
pod="glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.264078 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4zck\" (UniqueName: \"kubernetes.io/projected/f93d1a20-d7fd-441f-a496-c3cb5423ba2c-kube-api-access-z4zck\") pod \"glance-cache-glance-default-external-api-2-cleaner-29403307x27r\" (UID: \"f93d1a20-d7fd-441f-a496-c3cb5423ba2c\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.264138 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/c7b1631a-df7b-468c-af5b-c2b122fc10c2-image-cache-config-data\") pod \"glance-cache-glance-default-internal-api-2-cleaner-29403309n24g\" (UID: \"c7b1631a-df7b-468c-af5b-c2b122fc10c2\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.266834 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-cache-glance-default-internal-api-2-cleaner-29403309n24g\" (UID: \"c7b1631a-df7b-468c-af5b-c2b122fc10c2\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.267026 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/285f499e-c63a-45ed-9d95-9ad362cd5c69-image-cache-config-data\") pod \"glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5\" (UID: \"285f499e-c63a-45ed-9d95-9ad362cd5c69\") " 
pod="glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.267124 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-cache-glance-default-external-api-2-cleaner-29403307x27r\" (UID: \"f93d1a20-d7fd-441f-a496-c3cb5423ba2c\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.281227 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd"] Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.307989 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz\" (UID: \"434f550d-aa2c-4737-a324-236570bcd516\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.310806 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-cache-glance-default-internal-api-2-cleaner-29403309n24g\" (UID: \"c7b1631a-df7b-468c-af5b-c2b122fc10c2\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.313995 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-cache-glance-default-external-api-2-cleaner-29403307x27r\" (UID: \"f93d1a20-d7fd-441f-a496-c3cb5423ba2c\") " 
pod="glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.314300 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5\" (UID: \"285f499e-c63a-45ed-9d95-9ad362cd5c69\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.327224 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn\" (UID: \"54ecc7bd-bb80-4362-8b26-d1f7f78b1455\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.374849 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h82vb\" (UniqueName: \"kubernetes.io/projected/54ecc7bd-bb80-4362-8b26-d1f7f78b1455-kube-api-access-h82vb\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn\" (UID: \"54ecc7bd-bb80-4362-8b26-d1f7f78b1455\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.374900 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4zck\" (UniqueName: \"kubernetes.io/projected/f93d1a20-d7fd-441f-a496-c3cb5423ba2c-kube-api-access-z4zck\") pod \"glance-cache-glance-default-external-api-2-cleaner-29403307x27r\" (UID: \"f93d1a20-d7fd-441f-a496-c3cb5423ba2c\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.374922 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l86jr\" (UniqueName: \"kubernetes.io/projected/4efc663c-4d69-4417-b978-e705a3157b4e-kube-api-access-l86jr\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940330g6twp\" (UID: \"4efc663c-4d69-4417-b978-e705a3157b4e\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.374944 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpzqw\" (UniqueName: \"kubernetes.io/projected/7c5820a8-589a-4676-b36e-a1c90954ef93-kube-api-access-tpzqw\") pod \"collect-profiles-29403300-hnjzd\" (UID: \"7c5820a8-589a-4676-b36e-a1c90954ef93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.375019 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/c7b1631a-df7b-468c-af5b-c2b122fc10c2-image-cache-config-data\") pod \"glance-cache-glance-default-internal-api-2-cleaner-29403309n24g\" (UID: \"c7b1631a-df7b-468c-af5b-c2b122fc10c2\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.375059 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/4efc663c-4d69-4417-b978-e705a3157b4e-image-cache-config-data\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940330g6twp\" (UID: \"4efc663c-4d69-4417-b978-e705a3157b4e\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.375077 5008 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/285f499e-c63a-45ed-9d95-9ad362cd5c69-image-cache-config-data\") pod \"glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5\" (UID: \"285f499e-c63a-45ed-9d95-9ad362cd5c69\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.375100 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940330g6twp\" (UID: \"4efc663c-4d69-4417-b978-e705a3157b4e\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.375125 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c5820a8-589a-4676-b36e-a1c90954ef93-config-volume\") pod \"collect-profiles-29403300-hnjzd\" (UID: \"7c5820a8-589a-4676-b36e-a1c90954ef93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.375505 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/434f550d-aa2c-4737-a324-236570bcd516-image-cache-config-data\") pod \"glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz\" (UID: \"434f550d-aa2c-4737-a324-236570bcd516\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.375574 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/54ecc7bd-bb80-4362-8b26-d1f7f78b1455-image-cache-config-data\") pod 
\"glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn\" (UID: \"54ecc7bd-bb80-4362-8b26-d1f7f78b1455\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.375630 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6mf5\" (UniqueName: \"kubernetes.io/projected/285f499e-c63a-45ed-9d95-9ad362cd5c69-kube-api-access-r6mf5\") pod \"glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5\" (UID: \"285f499e-c63a-45ed-9d95-9ad362cd5c69\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.375758 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xg62\" (UniqueName: \"kubernetes.io/projected/c7b1631a-df7b-468c-af5b-c2b122fc10c2-kube-api-access-9xg62\") pod \"glance-cache-glance-default-internal-api-2-cleaner-29403309n24g\" (UID: \"c7b1631a-df7b-468c-af5b-c2b122fc10c2\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.375787 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgfb2\" (UniqueName: \"kubernetes.io/projected/434f550d-aa2c-4737-a324-236570bcd516-kube-api-access-xgfb2\") pod \"glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz\" (UID: \"434f550d-aa2c-4737-a324-236570bcd516\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.375808 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c5820a8-589a-4676-b36e-a1c90954ef93-secret-volume\") pod \"collect-profiles-29403300-hnjzd\" (UID: 
\"7c5820a8-589a-4676-b36e-a1c90954ef93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.375860 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/f93d1a20-d7fd-441f-a496-c3cb5423ba2c-image-cache-config-data\") pod \"glance-cache-glance-default-external-api-2-cleaner-29403307x27r\" (UID: \"f93d1a20-d7fd-441f-a496-c3cb5423ba2c\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.387762 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/f93d1a20-d7fd-441f-a496-c3cb5423ba2c-image-cache-config-data\") pod \"glance-cache-glance-default-external-api-2-cleaner-29403307x27r\" (UID: \"f93d1a20-d7fd-441f-a496-c3cb5423ba2c\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.387887 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/434f550d-aa2c-4737-a324-236570bcd516-image-cache-config-data\") pod \"glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz\" (UID: \"434f550d-aa2c-4737-a324-236570bcd516\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.389195 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/c7b1631a-df7b-468c-af5b-c2b122fc10c2-image-cache-config-data\") pod \"glance-cache-glance-default-internal-api-2-cleaner-29403309n24g\" (UID: \"c7b1631a-df7b-468c-af5b-c2b122fc10c2\") " 
pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.389411 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/285f499e-c63a-45ed-9d95-9ad362cd5c69-image-cache-config-data\") pod \"glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5\" (UID: \"285f499e-c63a-45ed-9d95-9ad362cd5c69\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.390516 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4zck\" (UniqueName: \"kubernetes.io/projected/f93d1a20-d7fd-441f-a496-c3cb5423ba2c-kube-api-access-z4zck\") pod \"glance-cache-glance-default-external-api-2-cleaner-29403307x27r\" (UID: \"f93d1a20-d7fd-441f-a496-c3cb5423ba2c\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.390930 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/54ecc7bd-bb80-4362-8b26-d1f7f78b1455-image-cache-config-data\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn\" (UID: \"54ecc7bd-bb80-4362-8b26-d1f7f78b1455\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.391475 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h82vb\" (UniqueName: \"kubernetes.io/projected/54ecc7bd-bb80-4362-8b26-d1f7f78b1455-kube-api-access-h82vb\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn\" (UID: \"54ecc7bd-bb80-4362-8b26-d1f7f78b1455\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn" Nov 26 
23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.392021 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xg62\" (UniqueName: \"kubernetes.io/projected/c7b1631a-df7b-468c-af5b-c2b122fc10c2-kube-api-access-9xg62\") pod \"glance-cache-glance-default-internal-api-2-cleaner-29403309n24g\" (UID: \"c7b1631a-df7b-468c-af5b-c2b122fc10c2\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.393147 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgfb2\" (UniqueName: \"kubernetes.io/projected/434f550d-aa2c-4737-a324-236570bcd516-kube-api-access-xgfb2\") pod \"glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz\" (UID: \"434f550d-aa2c-4737-a324-236570bcd516\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.394239 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6mf5\" (UniqueName: \"kubernetes.io/projected/285f499e-c63a-45ed-9d95-9ad362cd5c69-kube-api-access-r6mf5\") pod \"glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5\" (UID: \"285f499e-c63a-45ed-9d95-9ad362cd5c69\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.407760 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940330g6twp\" (UID: \"4efc663c-4d69-4417-b978-e705a3157b4e\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.477496 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-l86jr\" (UniqueName: \"kubernetes.io/projected/4efc663c-4d69-4417-b978-e705a3157b4e-kube-api-access-l86jr\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940330g6twp\" (UID: \"4efc663c-4d69-4417-b978-e705a3157b4e\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.477553 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpzqw\" (UniqueName: \"kubernetes.io/projected/7c5820a8-589a-4676-b36e-a1c90954ef93-kube-api-access-tpzqw\") pod \"collect-profiles-29403300-hnjzd\" (UID: \"7c5820a8-589a-4676-b36e-a1c90954ef93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.477598 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/4efc663c-4d69-4417-b978-e705a3157b4e-image-cache-config-data\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940330g6twp\" (UID: \"4efc663c-4d69-4417-b978-e705a3157b4e\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.477634 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c5820a8-589a-4676-b36e-a1c90954ef93-config-volume\") pod \"collect-profiles-29403300-hnjzd\" (UID: \"7c5820a8-589a-4676-b36e-a1c90954ef93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.477700 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c5820a8-589a-4676-b36e-a1c90954ef93-secret-volume\") pod \"collect-profiles-29403300-hnjzd\" (UID: 
\"7c5820a8-589a-4676-b36e-a1c90954ef93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.478997 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c5820a8-589a-4676-b36e-a1c90954ef93-config-volume\") pod \"collect-profiles-29403300-hnjzd\" (UID: \"7c5820a8-589a-4676-b36e-a1c90954ef93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.484025 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/4efc663c-4d69-4417-b978-e705a3157b4e-image-cache-config-data\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940330g6twp\" (UID: \"4efc663c-4d69-4417-b978-e705a3157b4e\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.489917 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c5820a8-589a-4676-b36e-a1c90954ef93-secret-volume\") pod \"collect-profiles-29403300-hnjzd\" (UID: \"7c5820a8-589a-4676-b36e-a1c90954ef93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.492293 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpzqw\" (UniqueName: \"kubernetes.io/projected/7c5820a8-589a-4676-b36e-a1c90954ef93-kube-api-access-tpzqw\") pod \"collect-profiles-29403300-hnjzd\" (UID: \"7c5820a8-589a-4676-b36e-a1c90954ef93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.492684 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.495053 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l86jr\" (UniqueName: \"kubernetes.io/projected/4efc663c-4d69-4417-b978-e705a3157b4e-kube-api-access-l86jr\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940330g6twp\" (UID: \"4efc663c-4d69-4417-b978-e705a3157b4e\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.514602 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.528490 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.539390 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.549259 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.559031 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.579328 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd" Nov 26 23:00:00 crc kubenswrapper[5008]: I1126 23:00:00.992767 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5"] Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.003341 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.003433 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.025085 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.025121 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.053099 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.053150 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.067713 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.114946 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.115047 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.124753 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.136867 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r"] Nov 26 23:00:01 crc kubenswrapper[5008]: W1126 23:00:01.140195 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54ecc7bd_bb80_4362_8b26_d1f7f78b1455.slice/crio-6c78328f2e92c4fc22282315c6fdb6f245f727e91927582a920e4f39ad01ac5d WatchSource:0}: Error finding container 6c78328f2e92c4fc22282315c6fdb6f245f727e91927582a920e4f39ad01ac5d: Status 404 returned error can't find the container with id 6c78328f2e92c4fc22282315c6fdb6f245f727e91927582a920e4f39ad01ac5d Nov 26 23:00:01 crc kubenswrapper[5008]: W1126 23:00:01.145907 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7b1631a_df7b_468c_af5b_c2b122fc10c2.slice/crio-f0e2f9dca8b7403b0e362f74b9b7478edc53a29644ecdbfd413a557cbd3b01d2 WatchSource:0}: Error finding container f0e2f9dca8b7403b0e362f74b9b7478edc53a29644ecdbfd413a557cbd3b01d2: Status 404 returned error can't find the container with id f0e2f9dca8b7403b0e362f74b9b7478edc53a29644ecdbfd413a557cbd3b01d2 Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.145951 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g"] Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.146121 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:00:01 crc kubenswrapper[5008]: W1126 23:00:01.150586 
5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf93d1a20_d7fd_441f_a496_c3cb5423ba2c.slice/crio-9a59e40063edac9b1f57bb183759b35d92b08ca86b964910a9f0e81fbfb02a18 WatchSource:0}: Error finding container 9a59e40063edac9b1f57bb183759b35d92b08ca86b964910a9f0e81fbfb02a18: Status 404 returned error can't find the container with id 9a59e40063edac9b1f57bb183759b35d92b08ca86b964910a9f0e81fbfb02a18 Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.155974 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn"] Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.161439 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.247232 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp"] Nov 26 23:00:01 crc kubenswrapper[5008]: W1126 23:00:01.248999 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4efc663c_4d69_4417_b978_e705a3157b4e.slice/crio-2a72a269bfaeec9f6a943177154287f10171d46677951e4c274bf5b5cf700457 WatchSource:0}: Error finding container 2a72a269bfaeec9f6a943177154287f10171d46677951e4c274bf5b5cf700457: Status 404 returned error can't find the container with id 2a72a269bfaeec9f6a943177154287f10171d46677951e4c274bf5b5cf700457 Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.259829 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz"] Nov 26 23:00:01 crc kubenswrapper[5008]: W1126 23:00:01.261120 5008 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod434f550d_aa2c_4737_a324_236570bcd516.slice/crio-5da06163bef06c53ade816708bdcbc2a8d34e45e43ad0ab4832f81b9e12ce7dd WatchSource:0}: Error finding container 5da06163bef06c53ade816708bdcbc2a8d34e45e43ad0ab4832f81b9e12ce7dd: Status 404 returned error can't find the container with id 5da06163bef06c53ade816708bdcbc2a8d34e45e43ad0ab4832f81b9e12ce7dd Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.269582 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd"] Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.370625 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.371697 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.421768 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.467253 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.655805 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5" event={"ID":"285f499e-c63a-45ed-9d95-9ad362cd5c69","Type":"ContainerStarted","Data":"5b10a3f1cbe01161e77e8bfed8223a2ca9d3e4e0bcc7d96e6fc8a2c941df15f8"} Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.656801 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g" 
event={"ID":"c7b1631a-df7b-468c-af5b-c2b122fc10c2","Type":"ContainerStarted","Data":"f0e2f9dca8b7403b0e362f74b9b7478edc53a29644ecdbfd413a557cbd3b01d2"} Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.657918 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r" event={"ID":"f93d1a20-d7fd-441f-a496-c3cb5423ba2c","Type":"ContainerStarted","Data":"9a59e40063edac9b1f57bb183759b35d92b08ca86b964910a9f0e81fbfb02a18"} Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.658932 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz" event={"ID":"434f550d-aa2c-4737-a324-236570bcd516","Type":"ContainerStarted","Data":"5da06163bef06c53ade816708bdcbc2a8d34e45e43ad0ab4832f81b9e12ce7dd"} Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.660509 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd" event={"ID":"7c5820a8-589a-4676-b36e-a1c90954ef93","Type":"ContainerStarted","Data":"64c0f2947ab87031003a9fbc73a538f52074c1487669bc33ee9616d282080a1d"} Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.660530 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd" event={"ID":"7c5820a8-589a-4676-b36e-a1c90954ef93","Type":"ContainerStarted","Data":"68b042c8b425f387d42f3e1eb7894298a558829bd5809a9e53c8ffb2035f55d6"} Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.663495 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp" event={"ID":"4efc663c-4d69-4417-b978-e705a3157b4e","Type":"ContainerStarted","Data":"2a72a269bfaeec9f6a943177154287f10171d46677951e4c274bf5b5cf700457"} Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.675405 5008 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn" event={"ID":"54ecc7bd-bb80-4362-8b26-d1f7f78b1455","Type":"ContainerStarted","Data":"6c78328f2e92c4fc22282315c6fdb6f245f727e91927582a920e4f39ad01ac5d"} Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.675606 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.676319 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.676684 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.677608 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.677681 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.677784 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.677845 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.677898 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:00:01 crc kubenswrapper[5008]: I1126 23:00:01.681617 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd" 
podStartSLOduration=1.6816014460000002 podStartE2EDuration="1.681601446s" podCreationTimestamp="2025-11-26 23:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:00:01.674292495 +0000 UTC m=+1277.086986497" watchObservedRunningTime="2025-11-26 23:00:01.681601446 +0000 UTC m=+1277.094295448" Nov 26 23:00:02 crc kubenswrapper[5008]: I1126 23:00:02.683212 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz" event={"ID":"434f550d-aa2c-4737-a324-236570bcd516","Type":"ContainerStarted","Data":"52e1f413cf9d7924c18fe4559b53d01acc21a753fa49a074acecbb8b3465143a"} Nov 26 23:00:02 crc kubenswrapper[5008]: I1126 23:00:02.685821 5008 generic.go:334] "Generic (PLEG): container finished" podID="7c5820a8-589a-4676-b36e-a1c90954ef93" containerID="64c0f2947ab87031003a9fbc73a538f52074c1487669bc33ee9616d282080a1d" exitCode=0 Nov 26 23:00:02 crc kubenswrapper[5008]: I1126 23:00:02.685867 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd" event={"ID":"7c5820a8-589a-4676-b36e-a1c90954ef93","Type":"ContainerDied","Data":"64c0f2947ab87031003a9fbc73a538f52074c1487669bc33ee9616d282080a1d"} Nov 26 23:00:02 crc kubenswrapper[5008]: I1126 23:00:02.688158 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp" event={"ID":"4efc663c-4d69-4417-b978-e705a3157b4e","Type":"ContainerStarted","Data":"b199246ba6514bd212b19da5cffe02da01a09bc833cdbbe7254f7bc3578b4ff2"} Nov 26 23:00:02 crc kubenswrapper[5008]: I1126 23:00:02.690453 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn" 
event={"ID":"54ecc7bd-bb80-4362-8b26-d1f7f78b1455","Type":"ContainerStarted","Data":"b0a544e63bd9b89fe5b2db73a586e32e6a9201c513af08641a5df0e7462f8742"} Nov 26 23:00:02 crc kubenswrapper[5008]: I1126 23:00:02.699148 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5" event={"ID":"285f499e-c63a-45ed-9d95-9ad362cd5c69","Type":"ContainerStarted","Data":"3229eec04d095aba2c0535299d00f7c6cc726d7d666982326f52a3c6997a22d2"} Nov 26 23:00:02 crc kubenswrapper[5008]: I1126 23:00:02.704435 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz" podStartSLOduration=2.704422204 podStartE2EDuration="2.704422204s" podCreationTimestamp="2025-11-26 23:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:00:02.69826835 +0000 UTC m=+1278.110962342" watchObservedRunningTime="2025-11-26 23:00:02.704422204 +0000 UTC m=+1278.117116196" Nov 26 23:00:02 crc kubenswrapper[5008]: I1126 23:00:02.706034 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g" event={"ID":"c7b1631a-df7b-468c-af5b-c2b122fc10c2","Type":"ContainerStarted","Data":"3d2aae74d4261c9c9da144ff3a3577e3741d6036d5ab98376112e13c774af8f0"} Nov 26 23:00:02 crc kubenswrapper[5008]: I1126 23:00:02.708302 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r" event={"ID":"f93d1a20-d7fd-441f-a496-c3cb5423ba2c","Type":"ContainerStarted","Data":"e18cf9ff814f026578fa4cc37aae1558adc65837ee87f6ad8a560496646fb57b"} Nov 26 23:00:02 crc kubenswrapper[5008]: I1126 23:00:02.721728 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn" podStartSLOduration=2.721712117 podStartE2EDuration="2.721712117s" podCreationTimestamp="2025-11-26 23:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:00:02.720015484 +0000 UTC m=+1278.132709486" watchObservedRunningTime="2025-11-26 23:00:02.721712117 +0000 UTC m=+1278.134406119" Nov 26 23:00:02 crc kubenswrapper[5008]: I1126 23:00:02.759530 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp" podStartSLOduration=2.759514315 podStartE2EDuration="2.759514315s" podCreationTimestamp="2025-11-26 23:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:00:02.753892699 +0000 UTC m=+1278.166586701" watchObservedRunningTime="2025-11-26 23:00:02.759514315 +0000 UTC m=+1278.172208317" Nov 26 23:00:02 crc kubenswrapper[5008]: I1126 23:00:02.775492 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g" podStartSLOduration=2.775479156 podStartE2EDuration="2.775479156s" podCreationTimestamp="2025-11-26 23:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:00:02.774277978 +0000 UTC m=+1278.186971980" watchObservedRunningTime="2025-11-26 23:00:02.775479156 +0000 UTC m=+1278.188173158" Nov 26 23:00:02 crc kubenswrapper[5008]: I1126 23:00:02.843788 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r" podStartSLOduration=2.843766971 podStartE2EDuration="2.843766971s" 
podCreationTimestamp="2025-11-26 23:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:00:02.800263455 +0000 UTC m=+1278.212957457" watchObservedRunningTime="2025-11-26 23:00:02.843766971 +0000 UTC m=+1278.256460973" Nov 26 23:00:02 crc kubenswrapper[5008]: I1126 23:00:02.844113 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5" podStartSLOduration=2.844107412 podStartE2EDuration="2.844107412s" podCreationTimestamp="2025-11-26 23:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:00:02.82333481 +0000 UTC m=+1278.236028812" watchObservedRunningTime="2025-11-26 23:00:02.844107412 +0000 UTC m=+1278.256801414" Nov 26 23:00:03 crc kubenswrapper[5008]: I1126 23:00:03.715785 5008 generic.go:334] "Generic (PLEG): container finished" podID="54ecc7bd-bb80-4362-8b26-d1f7f78b1455" containerID="b0a544e63bd9b89fe5b2db73a586e32e6a9201c513af08641a5df0e7462f8742" exitCode=0 Nov 26 23:00:03 crc kubenswrapper[5008]: I1126 23:00:03.715865 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn" event={"ID":"54ecc7bd-bb80-4362-8b26-d1f7f78b1455","Type":"ContainerDied","Data":"b0a544e63bd9b89fe5b2db73a586e32e6a9201c513af08641a5df0e7462f8742"} Nov 26 23:00:03 crc kubenswrapper[5008]: I1126 23:00:03.717814 5008 generic.go:334] "Generic (PLEG): container finished" podID="285f499e-c63a-45ed-9d95-9ad362cd5c69" containerID="3229eec04d095aba2c0535299d00f7c6cc726d7d666982326f52a3c6997a22d2" exitCode=0 Nov 26 23:00:03 crc kubenswrapper[5008]: I1126 23:00:03.717878 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5" event={"ID":"285f499e-c63a-45ed-9d95-9ad362cd5c69","Type":"ContainerDied","Data":"3229eec04d095aba2c0535299d00f7c6cc726d7d666982326f52a3c6997a22d2"} Nov 26 23:00:03 crc kubenswrapper[5008]: I1126 23:00:03.719379 5008 generic.go:334] "Generic (PLEG): container finished" podID="f93d1a20-d7fd-441f-a496-c3cb5423ba2c" containerID="e18cf9ff814f026578fa4cc37aae1558adc65837ee87f6ad8a560496646fb57b" exitCode=0 Nov 26 23:00:03 crc kubenswrapper[5008]: I1126 23:00:03.719453 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r" event={"ID":"f93d1a20-d7fd-441f-a496-c3cb5423ba2c","Type":"ContainerDied","Data":"e18cf9ff814f026578fa4cc37aae1558adc65837ee87f6ad8a560496646fb57b"} Nov 26 23:00:03 crc kubenswrapper[5008]: I1126 23:00:03.721434 5008 generic.go:334] "Generic (PLEG): container finished" podID="c7b1631a-df7b-468c-af5b-c2b122fc10c2" containerID="3d2aae74d4261c9c9da144ff3a3577e3741d6036d5ab98376112e13c774af8f0" exitCode=0 Nov 26 23:00:03 crc kubenswrapper[5008]: I1126 23:00:03.721472 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g" event={"ID":"c7b1631a-df7b-468c-af5b-c2b122fc10c2","Type":"ContainerDied","Data":"3d2aae74d4261c9c9da144ff3a3577e3741d6036d5ab98376112e13c774af8f0"} Nov 26 23:00:03 crc kubenswrapper[5008]: I1126 23:00:03.723006 5008 generic.go:334] "Generic (PLEG): container finished" podID="434f550d-aa2c-4737-a324-236570bcd516" containerID="52e1f413cf9d7924c18fe4559b53d01acc21a753fa49a074acecbb8b3465143a" exitCode=0 Nov 26 23:00:03 crc kubenswrapper[5008]: I1126 23:00:03.723048 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz" 
event={"ID":"434f550d-aa2c-4737-a324-236570bcd516","Type":"ContainerDied","Data":"52e1f413cf9d7924c18fe4559b53d01acc21a753fa49a074acecbb8b3465143a"} Nov 26 23:00:03 crc kubenswrapper[5008]: I1126 23:00:03.724502 5008 generic.go:334] "Generic (PLEG): container finished" podID="4efc663c-4d69-4417-b978-e705a3157b4e" containerID="b199246ba6514bd212b19da5cffe02da01a09bc833cdbbe7254f7bc3578b4ff2" exitCode=0 Nov 26 23:00:03 crc kubenswrapper[5008]: I1126 23:00:03.724657 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp" event={"ID":"4efc663c-4d69-4417-b978-e705a3157b4e","Type":"ContainerDied","Data":"b199246ba6514bd212b19da5cffe02da01a09bc833cdbbe7254f7bc3578b4ff2"} Nov 26 23:00:03 crc kubenswrapper[5008]: I1126 23:00:03.724721 5008 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 23:00:03 crc kubenswrapper[5008]: I1126 23:00:03.724731 5008 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 23:00:03 crc kubenswrapper[5008]: I1126 23:00:03.725386 5008 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 23:00:03 crc kubenswrapper[5008]: I1126 23:00:03.725400 5008 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 23:00:03 crc kubenswrapper[5008]: I1126 23:00:03.725795 5008 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 23:00:03 crc kubenswrapper[5008]: I1126 23:00:03.725809 5008 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 23:00:03 crc kubenswrapper[5008]: I1126 23:00:03.933695 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 23:00:03 crc kubenswrapper[5008]: I1126 23:00:03.947603 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" 
Nov 26 23:00:04 crc kubenswrapper[5008]: I1126 23:00:04.010655 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:00:04 crc kubenswrapper[5008]: I1126 23:00:04.029997 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:00:04 crc kubenswrapper[5008]: I1126 23:00:04.030098 5008 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 23:00:04 crc kubenswrapper[5008]: I1126 23:00:04.093204 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 23:00:04 crc kubenswrapper[5008]: I1126 23:00:04.166425 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd" Nov 26 23:00:04 crc kubenswrapper[5008]: I1126 23:00:04.204551 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:00:04 crc kubenswrapper[5008]: I1126 23:00:04.241916 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpzqw\" (UniqueName: \"kubernetes.io/projected/7c5820a8-589a-4676-b36e-a1c90954ef93-kube-api-access-tpzqw\") pod \"7c5820a8-589a-4676-b36e-a1c90954ef93\" (UID: \"7c5820a8-589a-4676-b36e-a1c90954ef93\") " Nov 26 23:00:04 crc kubenswrapper[5008]: I1126 23:00:04.242020 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c5820a8-589a-4676-b36e-a1c90954ef93-secret-volume\") pod \"7c5820a8-589a-4676-b36e-a1c90954ef93\" (UID: \"7c5820a8-589a-4676-b36e-a1c90954ef93\") " Nov 26 23:00:04 crc kubenswrapper[5008]: I1126 23:00:04.242204 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c5820a8-589a-4676-b36e-a1c90954ef93-config-volume\") pod \"7c5820a8-589a-4676-b36e-a1c90954ef93\" (UID: \"7c5820a8-589a-4676-b36e-a1c90954ef93\") " Nov 26 23:00:04 crc kubenswrapper[5008]: I1126 23:00:04.243108 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c5820a8-589a-4676-b36e-a1c90954ef93-config-volume" (OuterVolumeSpecName: "config-volume") pod "7c5820a8-589a-4676-b36e-a1c90954ef93" (UID: "7c5820a8-589a-4676-b36e-a1c90954ef93"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 23:00:04 crc kubenswrapper[5008]: I1126 23:00:04.249094 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c5820a8-589a-4676-b36e-a1c90954ef93-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7c5820a8-589a-4676-b36e-a1c90954ef93" (UID: "7c5820a8-589a-4676-b36e-a1c90954ef93"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:04 crc kubenswrapper[5008]: I1126 23:00:04.274053 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c5820a8-589a-4676-b36e-a1c90954ef93-kube-api-access-tpzqw" (OuterVolumeSpecName: "kube-api-access-tpzqw") pod "7c5820a8-589a-4676-b36e-a1c90954ef93" (UID: "7c5820a8-589a-4676-b36e-a1c90954ef93"). InnerVolumeSpecName "kube-api-access-tpzqw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:00:04 crc kubenswrapper[5008]: I1126 23:00:04.343922 5008 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c5820a8-589a-4676-b36e-a1c90954ef93-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:04 crc kubenswrapper[5008]: I1126 23:00:04.343958 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpzqw\" (UniqueName: \"kubernetes.io/projected/7c5820a8-589a-4676-b36e-a1c90954ef93-kube-api-access-tpzqw\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:04 crc kubenswrapper[5008]: I1126 23:00:04.343991 5008 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c5820a8-589a-4676-b36e-a1c90954ef93-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:04 crc kubenswrapper[5008]: I1126 23:00:04.475269 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:00:04 crc kubenswrapper[5008]: I1126 23:00:04.602502 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 23:00:04 crc kubenswrapper[5008]: I1126 23:00:04.735406 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd" event={"ID":"7c5820a8-589a-4676-b36e-a1c90954ef93","Type":"ContainerDied","Data":"68b042c8b425f387d42f3e1eb7894298a558829bd5809a9e53c8ffb2035f55d6"} Nov 26 23:00:04 crc kubenswrapper[5008]: I1126 23:00:04.735468 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68b042c8b425f387d42f3e1eb7894298a558829bd5809a9e53c8ffb2035f55d6" Nov 26 23:00:04 crc kubenswrapper[5008]: I1126 23:00:04.735634 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29403300-hnjzd" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.040190 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.127044 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.188811 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"434f550d-aa2c-4737-a324-236570bcd516\" (UID: \"434f550d-aa2c-4737-a324-236570bcd516\") " Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.188866 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/4efc663c-4d69-4417-b978-e705a3157b4e-image-cache-config-data\") pod \"4efc663c-4d69-4417-b978-e705a3157b4e\" (UID: \"4efc663c-4d69-4417-b978-e705a3157b4e\") " Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.188889 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l86jr\" (UniqueName: \"kubernetes.io/projected/4efc663c-4d69-4417-b978-e705a3157b4e-kube-api-access-l86jr\") pod \"4efc663c-4d69-4417-b978-e705a3157b4e\" (UID: \"4efc663c-4d69-4417-b978-e705a3157b4e\") " Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.189614 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"4efc663c-4d69-4417-b978-e705a3157b4e\" (UID: \"4efc663c-4d69-4417-b978-e705a3157b4e\") " Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 
23:00:05.189656 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/434f550d-aa2c-4737-a324-236570bcd516-image-cache-config-data\") pod \"434f550d-aa2c-4737-a324-236570bcd516\" (UID: \"434f550d-aa2c-4737-a324-236570bcd516\") " Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.189687 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgfb2\" (UniqueName: \"kubernetes.io/projected/434f550d-aa2c-4737-a324-236570bcd516-kube-api-access-xgfb2\") pod \"434f550d-aa2c-4737-a324-236570bcd516\" (UID: \"434f550d-aa2c-4737-a324-236570bcd516\") " Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.195155 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/434f550d-aa2c-4737-a324-236570bcd516-image-cache-config-data" (OuterVolumeSpecName: "image-cache-config-data") pod "434f550d-aa2c-4737-a324-236570bcd516" (UID: "434f550d-aa2c-4737-a324-236570bcd516"). InnerVolumeSpecName "image-cache-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.196072 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance-cache") pod "434f550d-aa2c-4737-a324-236570bcd516" (UID: "434f550d-aa2c-4737-a324-236570bcd516"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.196154 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4efc663c-4d69-4417-b978-e705a3157b4e-image-cache-config-data" (OuterVolumeSpecName: "image-cache-config-data") pod "4efc663c-4d69-4417-b978-e705a3157b4e" (UID: "4efc663c-4d69-4417-b978-e705a3157b4e"). InnerVolumeSpecName "image-cache-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.215119 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "4efc663c-4d69-4417-b978-e705a3157b4e" (UID: "4efc663c-4d69-4417-b978-e705a3157b4e"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.215147 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/434f550d-aa2c-4737-a324-236570bcd516-kube-api-access-xgfb2" (OuterVolumeSpecName: "kube-api-access-xgfb2") pod "434f550d-aa2c-4737-a324-236570bcd516" (UID: "434f550d-aa2c-4737-a324-236570bcd516"). InnerVolumeSpecName "kube-api-access-xgfb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.215203 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4efc663c-4d69-4417-b978-e705a3157b4e-kube-api-access-l86jr" (OuterVolumeSpecName: "kube-api-access-l86jr") pod "4efc663c-4d69-4417-b978-e705a3157b4e" (UID: "4efc663c-4d69-4417-b978-e705a3157b4e"). InnerVolumeSpecName "kube-api-access-l86jr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.242385 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.261478 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.268917 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.279747 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.285418 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.291807 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/285f499e-c63a-45ed-9d95-9ad362cd5c69-image-cache-config-data\") pod \"285f499e-c63a-45ed-9d95-9ad362cd5c69\" (UID: \"285f499e-c63a-45ed-9d95-9ad362cd5c69\") " Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.291938 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"285f499e-c63a-45ed-9d95-9ad362cd5c69\" (UID: \"285f499e-c63a-45ed-9d95-9ad362cd5c69\") " Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.292000 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6mf5\" (UniqueName: \"kubernetes.io/projected/285f499e-c63a-45ed-9d95-9ad362cd5c69-kube-api-access-r6mf5\") pod \"285f499e-c63a-45ed-9d95-9ad362cd5c69\" (UID: \"285f499e-c63a-45ed-9d95-9ad362cd5c69\") " Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.292433 5008 reconciler_common.go:293] "Volume detached for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/4efc663c-4d69-4417-b978-e705a3157b4e-image-cache-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.292450 5008 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-l86jr\" (UniqueName: \"kubernetes.io/projected/4efc663c-4d69-4417-b978-e705a3157b4e-kube-api-access-l86jr\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.292459 5008 reconciler_common.go:293] "Volume detached for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/434f550d-aa2c-4737-a324-236570bcd516-image-cache-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.292467 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgfb2\" (UniqueName: \"kubernetes.io/projected/434f550d-aa2c-4737-a324-236570bcd516-kube-api-access-xgfb2\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.296415 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285f499e-c63a-45ed-9d95-9ad362cd5c69-image-cache-config-data" (OuterVolumeSpecName: "image-cache-config-data") pod "285f499e-c63a-45ed-9d95-9ad362cd5c69" (UID: "285f499e-c63a-45ed-9d95-9ad362cd5c69"). InnerVolumeSpecName "image-cache-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.303045 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.305850 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance-cache") pod "285f499e-c63a-45ed-9d95-9ad362cd5c69" (UID: "285f499e-c63a-45ed-9d95-9ad362cd5c69"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.319265 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/285f499e-c63a-45ed-9d95-9ad362cd5c69-kube-api-access-r6mf5" (OuterVolumeSpecName: "kube-api-access-r6mf5") pod "285f499e-c63a-45ed-9d95-9ad362cd5c69" (UID: "285f499e-c63a-45ed-9d95-9ad362cd5c69"). InnerVolumeSpecName "kube-api-access-r6mf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.393180 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/c7b1631a-df7b-468c-af5b-c2b122fc10c2-image-cache-config-data\") pod \"c7b1631a-df7b-468c-af5b-c2b122fc10c2\" (UID: \"c7b1631a-df7b-468c-af5b-c2b122fc10c2\") " Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.393243 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"c7b1631a-df7b-468c-af5b-c2b122fc10c2\" (UID: \"c7b1631a-df7b-468c-af5b-c2b122fc10c2\") " Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.393306 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4zck\" (UniqueName: \"kubernetes.io/projected/f93d1a20-d7fd-441f-a496-c3cb5423ba2c-kube-api-access-z4zck\") pod \"f93d1a20-d7fd-441f-a496-c3cb5423ba2c\" (UID: \"f93d1a20-d7fd-441f-a496-c3cb5423ba2c\") " Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.393338 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/54ecc7bd-bb80-4362-8b26-d1f7f78b1455-image-cache-config-data\") pod \"54ecc7bd-bb80-4362-8b26-d1f7f78b1455\" (UID: \"54ecc7bd-bb80-4362-8b26-d1f7f78b1455\") " Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 
23:00:05.393360 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/f93d1a20-d7fd-441f-a496-c3cb5423ba2c-image-cache-config-data\") pod \"f93d1a20-d7fd-441f-a496-c3cb5423ba2c\" (UID: \"f93d1a20-d7fd-441f-a496-c3cb5423ba2c\") " Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.393420 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xg62\" (UniqueName: \"kubernetes.io/projected/c7b1631a-df7b-468c-af5b-c2b122fc10c2-kube-api-access-9xg62\") pod \"c7b1631a-df7b-468c-af5b-c2b122fc10c2\" (UID: \"c7b1631a-df7b-468c-af5b-c2b122fc10c2\") " Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.393439 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"54ecc7bd-bb80-4362-8b26-d1f7f78b1455\" (UID: \"54ecc7bd-bb80-4362-8b26-d1f7f78b1455\") " Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.393491 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"f93d1a20-d7fd-441f-a496-c3cb5423ba2c\" (UID: \"f93d1a20-d7fd-441f-a496-c3cb5423ba2c\") " Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.393519 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h82vb\" (UniqueName: \"kubernetes.io/projected/54ecc7bd-bb80-4362-8b26-d1f7f78b1455-kube-api-access-h82vb\") pod \"54ecc7bd-bb80-4362-8b26-d1f7f78b1455\" (UID: \"54ecc7bd-bb80-4362-8b26-d1f7f78b1455\") " Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.393891 5008 reconciler_common.go:293] "Volume detached for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/285f499e-c63a-45ed-9d95-9ad362cd5c69-image-cache-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 
23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.393912 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6mf5\" (UniqueName: \"kubernetes.io/projected/285f499e-c63a-45ed-9d95-9ad362cd5c69-kube-api-access-r6mf5\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.400870 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7b1631a-df7b-468c-af5b-c2b122fc10c2-kube-api-access-9xg62" (OuterVolumeSpecName: "kube-api-access-9xg62") pod "c7b1631a-df7b-468c-af5b-c2b122fc10c2" (UID: "c7b1631a-df7b-468c-af5b-c2b122fc10c2"). InnerVolumeSpecName "kube-api-access-9xg62". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.407124 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance-cache") pod "f93d1a20-d7fd-441f-a496-c3cb5423ba2c" (UID: "f93d1a20-d7fd-441f-a496-c3cb5423ba2c"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.407146 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance-cache") pod "c7b1631a-df7b-468c-af5b-c2b122fc10c2" (UID: "c7b1631a-df7b-468c-af5b-c2b122fc10c2"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.407186 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance-cache") pod "54ecc7bd-bb80-4362-8b26-d1f7f78b1455" (UID: "54ecc7bd-bb80-4362-8b26-d1f7f78b1455"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.407185 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93d1a20-d7fd-441f-a496-c3cb5423ba2c-image-cache-config-data" (OuterVolumeSpecName: "image-cache-config-data") pod "f93d1a20-d7fd-441f-a496-c3cb5423ba2c" (UID: "f93d1a20-d7fd-441f-a496-c3cb5423ba2c"). InnerVolumeSpecName "image-cache-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.407193 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93d1a20-d7fd-441f-a496-c3cb5423ba2c-kube-api-access-z4zck" (OuterVolumeSpecName: "kube-api-access-z4zck") pod "f93d1a20-d7fd-441f-a496-c3cb5423ba2c" (UID: "f93d1a20-d7fd-441f-a496-c3cb5423ba2c"). InnerVolumeSpecName "kube-api-access-z4zck". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.414249 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54ecc7bd-bb80-4362-8b26-d1f7f78b1455-kube-api-access-h82vb" (OuterVolumeSpecName: "kube-api-access-h82vb") pod "54ecc7bd-bb80-4362-8b26-d1f7f78b1455" (UID: "54ecc7bd-bb80-4362-8b26-d1f7f78b1455"). InnerVolumeSpecName "kube-api-access-h82vb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.414574 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ecc7bd-bb80-4362-8b26-d1f7f78b1455-image-cache-config-data" (OuterVolumeSpecName: "image-cache-config-data") pod "54ecc7bd-bb80-4362-8b26-d1f7f78b1455" (UID: "54ecc7bd-bb80-4362-8b26-d1f7f78b1455"). InnerVolumeSpecName "image-cache-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.417098 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b1631a-df7b-468c-af5b-c2b122fc10c2-image-cache-config-data" (OuterVolumeSpecName: "image-cache-config-data") pod "c7b1631a-df7b-468c-af5b-c2b122fc10c2" (UID: "c7b1631a-df7b-468c-af5b-c2b122fc10c2"). InnerVolumeSpecName "image-cache-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.440877 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.464894 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.495360 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h82vb\" (UniqueName: \"kubernetes.io/projected/54ecc7bd-bb80-4362-8b26-d1f7f78b1455-kube-api-access-h82vb\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.495395 5008 reconciler_common.go:293] "Volume detached for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/c7b1631a-df7b-468c-af5b-c2b122fc10c2-image-cache-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.495404 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4zck\" (UniqueName: \"kubernetes.io/projected/f93d1a20-d7fd-441f-a496-c3cb5423ba2c-kube-api-access-z4zck\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.495413 5008 reconciler_common.go:293] "Volume detached for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/54ecc7bd-bb80-4362-8b26-d1f7f78b1455-image-cache-config-data\") on node \"crc\" DevicePath \"\"" Nov 
26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.495421 5008 reconciler_common.go:293] "Volume detached for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/f93d1a20-d7fd-441f-a496-c3cb5423ba2c-image-cache-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.495431 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xg62\" (UniqueName: \"kubernetes.io/projected/c7b1631a-df7b-468c-af5b-c2b122fc10c2-kube-api-access-9xg62\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.746666 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g" event={"ID":"c7b1631a-df7b-468c-af5b-c2b122fc10c2","Type":"ContainerDied","Data":"f0e2f9dca8b7403b0e362f74b9b7478edc53a29644ecdbfd413a557cbd3b01d2"} Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.746716 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0e2f9dca8b7403b0e362f74b9b7478edc53a29644ecdbfd413a557cbd3b01d2" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.746801 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.749430 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz" event={"ID":"434f550d-aa2c-4737-a324-236570bcd516","Type":"ContainerDied","Data":"5da06163bef06c53ade816708bdcbc2a8d34e45e43ad0ab4832f81b9e12ce7dd"} Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.749476 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5da06163bef06c53ade816708bdcbc2a8d34e45e43ad0ab4832f81b9e12ce7dd" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.749555 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.752937 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.752926 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp" event={"ID":"4efc663c-4d69-4417-b978-e705a3157b4e","Type":"ContainerDied","Data":"2a72a269bfaeec9f6a943177154287f10171d46677951e4c274bf5b5cf700457"} Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.753116 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a72a269bfaeec9f6a943177154287f10171d46677951e4c274bf5b5cf700457" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.754571 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn" event={"ID":"54ecc7bd-bb80-4362-8b26-d1f7f78b1455","Type":"ContainerDied","Data":"6c78328f2e92c4fc22282315c6fdb6f245f727e91927582a920e4f39ad01ac5d"} Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.754600 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c78328f2e92c4fc22282315c6fdb6f245f727e91927582a920e4f39ad01ac5d" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.754681 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.759988 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5" event={"ID":"285f499e-c63a-45ed-9d95-9ad362cd5c69","Type":"ContainerDied","Data":"5b10a3f1cbe01161e77e8bfed8223a2ca9d3e4e0bcc7d96e6fc8a2c941df15f8"} Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.760016 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b10a3f1cbe01161e77e8bfed8223a2ca9d3e4e0bcc7d96e6fc8a2c941df15f8" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.760069 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.762583 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.762746 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r" event={"ID":"f93d1a20-d7fd-441f-a496-c3cb5423ba2c","Type":"ContainerDied","Data":"9a59e40063edac9b1f57bb183759b35d92b08ca86b964910a9f0e81fbfb02a18"} Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.762792 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a59e40063edac9b1f57bb183759b35d92b08ca86b964910a9f0e81fbfb02a18" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.763553 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="1c5b25d7-9f9f-4962-874e-01dc54b179a8" containerName="glance-log" containerID="cri-o://0a3539f87f23a1657de7bf2fb9b018ce7c74820d36d1ebeaa0602bf7313a2558" gracePeriod=30 Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.763645 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="1c5b25d7-9f9f-4962-874e-01dc54b179a8" containerName="glance-httpd" containerID="cri-o://1e78ed00c65d181e0ccaaf41266ab01b3970c036d88608ffa711bca10e70818d" gracePeriod=30 Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.774399 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="1c5b25d7-9f9f-4962-874e-01dc54b179a8" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.124:9292/healthcheck\": EOF" Nov 26 23:00:05 crc kubenswrapper[5008]: I1126 23:00:05.774544 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="1c5b25d7-9f9f-4962-874e-01dc54b179a8" 
containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.124:9292/healthcheck\": EOF" Nov 26 23:00:06 crc kubenswrapper[5008]: I1126 23:00:06.780233 5008 generic.go:334] "Generic (PLEG): container finished" podID="1c5b25d7-9f9f-4962-874e-01dc54b179a8" containerID="0a3539f87f23a1657de7bf2fb9b018ce7c74820d36d1ebeaa0602bf7313a2558" exitCode=143 Nov 26 23:00:06 crc kubenswrapper[5008]: I1126 23:00:06.780356 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"1c5b25d7-9f9f-4962-874e-01dc54b179a8","Type":"ContainerDied","Data":"0a3539f87f23a1657de7bf2fb9b018ce7c74820d36d1ebeaa0602bf7313a2558"} Nov 26 23:00:06 crc kubenswrapper[5008]: I1126 23:00:06.780670 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" containerName="glance-log" containerID="cri-o://2d950477c7980bc552685cb0a2525fb49f76d35b18998ddd8e59a38e33507898" gracePeriod=30 Nov 26 23:00:06 crc kubenswrapper[5008]: I1126 23:00:06.780722 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" containerName="glance-httpd" containerID="cri-o://f4ef23dce81c940e4b3b566f6de12c69a4f05aac6498107f3f55e41033094034" gracePeriod=30 Nov 26 23:00:06 crc kubenswrapper[5008]: I1126 23:00:06.780889 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="1df15eb2-fdba-49b3-831e-9ff30608c249" containerName="glance-log" containerID="cri-o://7b46dde8bed842e1bea0c357187cbb0a262e57b11275fc456506add6e86bad9c" gracePeriod=30 Nov 26 23:00:06 crc kubenswrapper[5008]: I1126 23:00:06.780924 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" 
podUID="1df15eb2-fdba-49b3-831e-9ff30608c249" containerName="glance-httpd" containerID="cri-o://de27d0158d9b4b33c7060b41d4fbacb1f18aeaae6e3d203c586e4cf10e0328c3" gracePeriod=30 Nov 26 23:00:06 crc kubenswrapper[5008]: I1126 23:00:06.781181 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="cf41e648-346e-4019-8a38-80d1508dd2d2" containerName="glance-httpd" containerID="cri-o://3e66c76616ad120c07ab33d111bed66ba7f2191bfb4d02dc3c1216191575dd72" gracePeriod=30 Nov 26 23:00:06 crc kubenswrapper[5008]: I1126 23:00:06.781156 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="cf41e648-346e-4019-8a38-80d1508dd2d2" containerName="glance-log" containerID="cri-o://f68ddbee848e96ccbbdf85288e5752b18e0e10c5a723022ca70176edb1042c55" gracePeriod=30 Nov 26 23:00:07 crc kubenswrapper[5008]: I1126 23:00:07.795284 5008 generic.go:334] "Generic (PLEG): container finished" podID="cf41e648-346e-4019-8a38-80d1508dd2d2" containerID="f68ddbee848e96ccbbdf85288e5752b18e0e10c5a723022ca70176edb1042c55" exitCode=143 Nov 26 23:00:07 crc kubenswrapper[5008]: I1126 23:00:07.795395 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"cf41e648-346e-4019-8a38-80d1508dd2d2","Type":"ContainerDied","Data":"f68ddbee848e96ccbbdf85288e5752b18e0e10c5a723022ca70176edb1042c55"} Nov 26 23:00:07 crc kubenswrapper[5008]: I1126 23:00:07.799959 5008 generic.go:334] "Generic (PLEG): container finished" podID="8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" containerID="2d950477c7980bc552685cb0a2525fb49f76d35b18998ddd8e59a38e33507898" exitCode=143 Nov 26 23:00:07 crc kubenswrapper[5008]: I1126 23:00:07.800056 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" 
event={"ID":"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d","Type":"ContainerDied","Data":"2d950477c7980bc552685cb0a2525fb49f76d35b18998ddd8e59a38e33507898"} Nov 26 23:00:07 crc kubenswrapper[5008]: I1126 23:00:07.804441 5008 generic.go:334] "Generic (PLEG): container finished" podID="1df15eb2-fdba-49b3-831e-9ff30608c249" containerID="7b46dde8bed842e1bea0c357187cbb0a262e57b11275fc456506add6e86bad9c" exitCode=143 Nov 26 23:00:07 crc kubenswrapper[5008]: I1126 23:00:07.804505 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"1df15eb2-fdba-49b3-831e-9ff30608c249","Type":"ContainerDied","Data":"7b46dde8bed842e1bea0c357187cbb0a262e57b11275fc456506add6e86bad9c"} Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.721488 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.813763 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-var-locks-brick\") pod \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.813829 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "1c5b25d7-9f9f-4962-874e-01dc54b179a8" (UID: "1c5b25d7-9f9f-4962-874e-01dc54b179a8"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.813863 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-dev\") pod \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.813978 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-dev" (OuterVolumeSpecName: "dev") pod "1c5b25d7-9f9f-4962-874e-01dc54b179a8" (UID: "1c5b25d7-9f9f-4962-874e-01dc54b179a8"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.814024 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-run\") pod \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.814051 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c5b25d7-9f9f-4962-874e-01dc54b179a8-scripts\") pod \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.814094 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-run" (OuterVolumeSpecName: "run") pod "1c5b25d7-9f9f-4962-874e-01dc54b179a8" (UID: "1c5b25d7-9f9f-4962-874e-01dc54b179a8"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.814120 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-etc-iscsi\") pod \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.814143 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.814177 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "1c5b25d7-9f9f-4962-874e-01dc54b179a8" (UID: "1c5b25d7-9f9f-4962-874e-01dc54b179a8"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.814408 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-sys" (OuterVolumeSpecName: "sys") pod "1c5b25d7-9f9f-4962-874e-01dc54b179a8" (UID: "1c5b25d7-9f9f-4962-874e-01dc54b179a8"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.814908 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-sys\") pod \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.814956 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tszmh\" (UniqueName: \"kubernetes.io/projected/1c5b25d7-9f9f-4962-874e-01dc54b179a8-kube-api-access-tszmh\") pod \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.815038 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-etc-nvme\") pod \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.815079 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-lib-modules\") pod \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.815138 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c5b25d7-9f9f-4962-874e-01dc54b179a8-config-data\") pod \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.815167 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1c5b25d7-9f9f-4962-874e-01dc54b179a8-logs\") pod \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.815171 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "1c5b25d7-9f9f-4962-874e-01dc54b179a8" (UID: "1c5b25d7-9f9f-4962-874e-01dc54b179a8"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.815191 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.815238 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c5b25d7-9f9f-4962-874e-01dc54b179a8-httpd-run\") pod \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\" (UID: \"1c5b25d7-9f9f-4962-874e-01dc54b179a8\") " Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.815269 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "1c5b25d7-9f9f-4962-874e-01dc54b179a8" (UID: "1c5b25d7-9f9f-4962-874e-01dc54b179a8"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.815592 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.815612 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-dev\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.815711 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-run\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.815676 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c5b25d7-9f9f-4962-874e-01dc54b179a8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1c5b25d7-9f9f-4962-874e-01dc54b179a8" (UID: "1c5b25d7-9f9f-4962-874e-01dc54b179a8"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.815724 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.815831 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-sys\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.815876 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.815897 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c5b25d7-9f9f-4962-874e-01dc54b179a8-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.816226 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c5b25d7-9f9f-4962-874e-01dc54b179a8-logs" (OuterVolumeSpecName: "logs") pod "1c5b25d7-9f9f-4962-874e-01dc54b179a8" (UID: "1c5b25d7-9f9f-4962-874e-01dc54b179a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.819359 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c5b25d7-9f9f-4962-874e-01dc54b179a8-scripts" (OuterVolumeSpecName: "scripts") pod "1c5b25d7-9f9f-4962-874e-01dc54b179a8" (UID: "1c5b25d7-9f9f-4962-874e-01dc54b179a8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.820026 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c5b25d7-9f9f-4962-874e-01dc54b179a8-kube-api-access-tszmh" (OuterVolumeSpecName: "kube-api-access-tszmh") pod "1c5b25d7-9f9f-4962-874e-01dc54b179a8" (UID: "1c5b25d7-9f9f-4962-874e-01dc54b179a8"). InnerVolumeSpecName "kube-api-access-tszmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.820623 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance-cache") pod "1c5b25d7-9f9f-4962-874e-01dc54b179a8" (UID: "1c5b25d7-9f9f-4962-874e-01dc54b179a8"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.821207 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "1c5b25d7-9f9f-4962-874e-01dc54b179a8" (UID: "1c5b25d7-9f9f-4962-874e-01dc54b179a8"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.861239 5008 generic.go:334] "Generic (PLEG): container finished" podID="1c5b25d7-9f9f-4962-874e-01dc54b179a8" containerID="1e78ed00c65d181e0ccaaf41266ab01b3970c036d88608ffa711bca10e70818d" exitCode=0 Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.861287 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"1c5b25d7-9f9f-4962-874e-01dc54b179a8","Type":"ContainerDied","Data":"1e78ed00c65d181e0ccaaf41266ab01b3970c036d88608ffa711bca10e70818d"} Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.861313 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.861330 5008 scope.go:117] "RemoveContainer" containerID="1e78ed00c65d181e0ccaaf41266ab01b3970c036d88608ffa711bca10e70818d" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.861319 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"1c5b25d7-9f9f-4962-874e-01dc54b179a8","Type":"ContainerDied","Data":"09faa6a915af7abb5bb954fd4dc4fe5742c56ac7836b09e194ae124551444780"} Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.868313 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c5b25d7-9f9f-4962-874e-01dc54b179a8-config-data" (OuterVolumeSpecName: "config-data") pod "1c5b25d7-9f9f-4962-874e-01dc54b179a8" (UID: "1c5b25d7-9f9f-4962-874e-01dc54b179a8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.917383 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c5b25d7-9f9f-4962-874e-01dc54b179a8-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.917445 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.917457 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tszmh\" (UniqueName: \"kubernetes.io/projected/1c5b25d7-9f9f-4962-874e-01dc54b179a8-kube-api-access-tszmh\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.917467 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c5b25d7-9f9f-4962-874e-01dc54b179a8-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.917476 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c5b25d7-9f9f-4962-874e-01dc54b179a8-logs\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.917488 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.917515 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c5b25d7-9f9f-4962-874e-01dc54b179a8-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.927690 5008 scope.go:117] "RemoveContainer" 
containerID="0a3539f87f23a1657de7bf2fb9b018ce7c74820d36d1ebeaa0602bf7313a2558" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.930069 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.930388 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.954372 5008 scope.go:117] "RemoveContainer" containerID="1e78ed00c65d181e0ccaaf41266ab01b3970c036d88608ffa711bca10e70818d" Nov 26 23:00:09 crc kubenswrapper[5008]: E1126 23:00:09.954779 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e78ed00c65d181e0ccaaf41266ab01b3970c036d88608ffa711bca10e70818d\": container with ID starting with 1e78ed00c65d181e0ccaaf41266ab01b3970c036d88608ffa711bca10e70818d not found: ID does not exist" containerID="1e78ed00c65d181e0ccaaf41266ab01b3970c036d88608ffa711bca10e70818d" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.954811 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e78ed00c65d181e0ccaaf41266ab01b3970c036d88608ffa711bca10e70818d"} err="failed to get container status \"1e78ed00c65d181e0ccaaf41266ab01b3970c036d88608ffa711bca10e70818d\": rpc error: code = NotFound desc = could not find container \"1e78ed00c65d181e0ccaaf41266ab01b3970c036d88608ffa711bca10e70818d\": container with ID starting with 1e78ed00c65d181e0ccaaf41266ab01b3970c036d88608ffa711bca10e70818d not found: ID does not exist" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.954841 5008 scope.go:117] "RemoveContainer" containerID="0a3539f87f23a1657de7bf2fb9b018ce7c74820d36d1ebeaa0602bf7313a2558" Nov 26 23:00:09 crc kubenswrapper[5008]: 
E1126 23:00:09.955385 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a3539f87f23a1657de7bf2fb9b018ce7c74820d36d1ebeaa0602bf7313a2558\": container with ID starting with 0a3539f87f23a1657de7bf2fb9b018ce7c74820d36d1ebeaa0602bf7313a2558 not found: ID does not exist" containerID="0a3539f87f23a1657de7bf2fb9b018ce7c74820d36d1ebeaa0602bf7313a2558" Nov 26 23:00:09 crc kubenswrapper[5008]: I1126 23:00:09.955445 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a3539f87f23a1657de7bf2fb9b018ce7c74820d36d1ebeaa0602bf7313a2558"} err="failed to get container status \"0a3539f87f23a1657de7bf2fb9b018ce7c74820d36d1ebeaa0602bf7313a2558\": rpc error: code = NotFound desc = could not find container \"0a3539f87f23a1657de7bf2fb9b018ce7c74820d36d1ebeaa0602bf7313a2558\": container with ID starting with 0a3539f87f23a1657de7bf2fb9b018ce7c74820d36d1ebeaa0602bf7313a2558 not found: ID does not exist" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.019418 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.019448 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.195439 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.201502 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.272852 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.327134 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.327190 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-scripts\") pod \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.327228 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-sys\") pod \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.327251 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-run\") pod \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.327320 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-sys" (OuterVolumeSpecName: "sys") pod "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" (UID: "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.327371 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-config-data\") pod \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.327421 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.327414 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-run" (OuterVolumeSpecName: "run") pod "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" (UID: "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.327458 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-logs\") pod \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.328138 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-etc-iscsi\") pod \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.328242 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-logs" (OuterVolumeSpecName: "logs") pod "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" (UID: "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.328252 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvwgh\" (UniqueName: \"kubernetes.io/projected/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-kube-api-access-dvwgh\") pod \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.328377 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-httpd-run\") pod \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.328444 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-var-locks-brick\") pod \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.328504 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-lib-modules\") pod \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.328561 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-dev\") pod \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.328599 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-etc-nvme\") pod \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\" (UID: \"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.330164 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" (UID: "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.330224 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" (UID: "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.330224 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" (UID: "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.330256 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-dev" (OuterVolumeSpecName: "dev") pod "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" (UID: "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.330390 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" (UID: "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.330557 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" (UID: "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.332124 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-scripts" (OuterVolumeSpecName: "scripts") pod "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" (UID: "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.332635 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-kube-api-access-dvwgh" (OuterVolumeSpecName: "kube-api-access-dvwgh") pod "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" (UID: "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d"). InnerVolumeSpecName "kube-api-access-dvwgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.332737 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "glance") pod "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" (UID: "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.332858 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.332895 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.332913 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.332930 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-dev\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.332946 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.332983 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-sys\") on node \"crc\" 
DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.332999 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-run\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.333015 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-logs\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.333033 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.334213 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance-cache") pod "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" (UID: "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.383167 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5"] Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.383567 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-config-data" (OuterVolumeSpecName: "config-data") pod "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" (UID: "8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.390244 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-1-cleaner-2940330d4hp5"] Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.426495 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.430886 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.433869 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvwgh\" (UniqueName: \"kubernetes.io/projected/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-kube-api-access-dvwgh\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.433899 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.433910 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.433919 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.433935 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Nov 26 23:00:10 crc kubenswrapper[5008]: 
I1126 23:00:10.458598 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.464009 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535149 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf41e648-346e-4019-8a38-80d1508dd2d2-httpd-run\") pod \"cf41e648-346e-4019-8a38-80d1508dd2d2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535244 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"cf41e648-346e-4019-8a38-80d1508dd2d2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535285 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-run\") pod \"1df15eb2-fdba-49b3-831e-9ff30608c249\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535324 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmd98\" (UniqueName: \"kubernetes.io/projected/1df15eb2-fdba-49b3-831e-9ff30608c249-kube-api-access-pmd98\") pod \"1df15eb2-fdba-49b3-831e-9ff30608c249\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535370 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cf41e648-346e-4019-8a38-80d1508dd2d2-config-data\") pod \"cf41e648-346e-4019-8a38-80d1508dd2d2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535413 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1df15eb2-fdba-49b3-831e-9ff30608c249-scripts\") pod \"1df15eb2-fdba-49b3-831e-9ff30608c249\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535447 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-dev\") pod \"1df15eb2-fdba-49b3-831e-9ff30608c249\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535493 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-lib-modules\") pod \"1df15eb2-fdba-49b3-831e-9ff30608c249\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535501 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf41e648-346e-4019-8a38-80d1508dd2d2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cf41e648-346e-4019-8a38-80d1508dd2d2" (UID: "cf41e648-346e-4019-8a38-80d1508dd2d2"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535570 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-etc-iscsi\") pod \"cf41e648-346e-4019-8a38-80d1508dd2d2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535599 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-var-locks-brick\") pod \"cf41e648-346e-4019-8a38-80d1508dd2d2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535645 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf41e648-346e-4019-8a38-80d1508dd2d2-logs\") pod \"cf41e648-346e-4019-8a38-80d1508dd2d2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535705 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c72nw\" (UniqueName: \"kubernetes.io/projected/cf41e648-346e-4019-8a38-80d1508dd2d2-kube-api-access-c72nw\") pod \"cf41e648-346e-4019-8a38-80d1508dd2d2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535745 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"1df15eb2-fdba-49b3-831e-9ff30608c249\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535766 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-run" 
(OuterVolumeSpecName: "run") pod "1df15eb2-fdba-49b3-831e-9ff30608c249" (UID: "1df15eb2-fdba-49b3-831e-9ff30608c249"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535802 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df15eb2-fdba-49b3-831e-9ff30608c249-config-data\") pod \"1df15eb2-fdba-49b3-831e-9ff30608c249\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535820 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "1df15eb2-fdba-49b3-831e-9ff30608c249" (UID: "1df15eb2-fdba-49b3-831e-9ff30608c249"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535850 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-etc-nvme\") pod \"1df15eb2-fdba-49b3-831e-9ff30608c249\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535833 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "cf41e648-346e-4019-8a38-80d1508dd2d2" (UID: "cf41e648-346e-4019-8a38-80d1508dd2d2"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535857 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "cf41e648-346e-4019-8a38-80d1508dd2d2" (UID: "cf41e648-346e-4019-8a38-80d1508dd2d2"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535885 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-dev" (OuterVolumeSpecName: "dev") pod "1df15eb2-fdba-49b3-831e-9ff30608c249" (UID: "1df15eb2-fdba-49b3-831e-9ff30608c249"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535896 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1df15eb2-fdba-49b3-831e-9ff30608c249-httpd-run\") pod \"1df15eb2-fdba-49b3-831e-9ff30608c249\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535925 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-dev\") pod \"cf41e648-346e-4019-8a38-80d1508dd2d2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.535959 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-run\") pod \"cf41e648-346e-4019-8a38-80d1508dd2d2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.536036 5008 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-etc-iscsi\") pod \"1df15eb2-fdba-49b3-831e-9ff30608c249\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.536076 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"cf41e648-346e-4019-8a38-80d1508dd2d2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.536115 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-etc-nvme\") pod \"cf41e648-346e-4019-8a38-80d1508dd2d2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.536160 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-sys\") pod \"cf41e648-346e-4019-8a38-80d1508dd2d2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.536206 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-sys\") pod \"1df15eb2-fdba-49b3-831e-9ff30608c249\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.536252 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1df15eb2-fdba-49b3-831e-9ff30608c249-logs\") pod \"1df15eb2-fdba-49b3-831e-9ff30608c249\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.536327 
5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-lib-modules\") pod \"cf41e648-346e-4019-8a38-80d1508dd2d2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.536368 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"1df15eb2-fdba-49b3-831e-9ff30608c249\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.536409 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-var-locks-brick\") pod \"1df15eb2-fdba-49b3-831e-9ff30608c249\" (UID: \"1df15eb2-fdba-49b3-831e-9ff30608c249\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.536466 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf41e648-346e-4019-8a38-80d1508dd2d2-scripts\") pod \"cf41e648-346e-4019-8a38-80d1508dd2d2\" (UID: \"cf41e648-346e-4019-8a38-80d1508dd2d2\") " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.536621 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf41e648-346e-4019-8a38-80d1508dd2d2-logs" (OuterVolumeSpecName: "logs") pod "cf41e648-346e-4019-8a38-80d1508dd2d2" (UID: "cf41e648-346e-4019-8a38-80d1508dd2d2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.537032 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.537079 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.537106 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf41e648-346e-4019-8a38-80d1508dd2d2-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.537134 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-run\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.537161 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-dev\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.537187 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.537214 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.537237 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.537260 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf41e648-346e-4019-8a38-80d1508dd2d2-logs\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.537919 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "cf41e648-346e-4019-8a38-80d1508dd2d2" (UID: "cf41e648-346e-4019-8a38-80d1508dd2d2"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.537949 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "1df15eb2-fdba-49b3-831e-9ff30608c249" (UID: "1df15eb2-fdba-49b3-831e-9ff30608c249"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.537920 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-dev" (OuterVolumeSpecName: "dev") pod "cf41e648-346e-4019-8a38-80d1508dd2d2" (UID: "cf41e648-346e-4019-8a38-80d1508dd2d2"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.538038 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-run" (OuterVolumeSpecName: "run") pod "cf41e648-346e-4019-8a38-80d1508dd2d2" (UID: "cf41e648-346e-4019-8a38-80d1508dd2d2"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.538281 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "1df15eb2-fdba-49b3-831e-9ff30608c249" (UID: "1df15eb2-fdba-49b3-831e-9ff30608c249"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.538323 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "1df15eb2-fdba-49b3-831e-9ff30608c249" (UID: "1df15eb2-fdba-49b3-831e-9ff30608c249"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.538319 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "cf41e648-346e-4019-8a38-80d1508dd2d2" (UID: "cf41e648-346e-4019-8a38-80d1508dd2d2"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.538347 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-sys" (OuterVolumeSpecName: "sys") pod "cf41e648-346e-4019-8a38-80d1508dd2d2" (UID: "cf41e648-346e-4019-8a38-80d1508dd2d2"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.538408 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-sys" (OuterVolumeSpecName: "sys") pod "1df15eb2-fdba-49b3-831e-9ff30608c249" (UID: "1df15eb2-fdba-49b3-831e-9ff30608c249"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.538490 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df15eb2-fdba-49b3-831e-9ff30608c249-scripts" (OuterVolumeSpecName: "scripts") pod "1df15eb2-fdba-49b3-831e-9ff30608c249" (UID: "1df15eb2-fdba-49b3-831e-9ff30608c249"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.538572 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1df15eb2-fdba-49b3-831e-9ff30608c249-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1df15eb2-fdba-49b3-831e-9ff30608c249" (UID: "1df15eb2-fdba-49b3-831e-9ff30608c249"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.538902 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1df15eb2-fdba-49b3-831e-9ff30608c249-logs" (OuterVolumeSpecName: "logs") pod "1df15eb2-fdba-49b3-831e-9ff30608c249" (UID: "1df15eb2-fdba-49b3-831e-9ff30608c249"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.539541 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance-cache") pod "1df15eb2-fdba-49b3-831e-9ff30608c249" (UID: "1df15eb2-fdba-49b3-831e-9ff30608c249"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.539678 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance-cache") pod "cf41e648-346e-4019-8a38-80d1508dd2d2" (UID: "cf41e648-346e-4019-8a38-80d1508dd2d2"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.540130 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf41e648-346e-4019-8a38-80d1508dd2d2-kube-api-access-c72nw" (OuterVolumeSpecName: "kube-api-access-c72nw") pod "cf41e648-346e-4019-8a38-80d1508dd2d2" (UID: "cf41e648-346e-4019-8a38-80d1508dd2d2"). InnerVolumeSpecName "kube-api-access-c72nw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.540674 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "1df15eb2-fdba-49b3-831e-9ff30608c249" (UID: "1df15eb2-fdba-49b3-831e-9ff30608c249"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.541844 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "glance") pod "cf41e648-346e-4019-8a38-80d1508dd2d2" (UID: "cf41e648-346e-4019-8a38-80d1508dd2d2"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.543023 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf41e648-346e-4019-8a38-80d1508dd2d2-scripts" (OuterVolumeSpecName: "scripts") pod "cf41e648-346e-4019-8a38-80d1508dd2d2" (UID: "cf41e648-346e-4019-8a38-80d1508dd2d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.553672 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1df15eb2-fdba-49b3-831e-9ff30608c249-kube-api-access-pmd98" (OuterVolumeSpecName: "kube-api-access-pmd98") pod "1df15eb2-fdba-49b3-831e-9ff30608c249" (UID: "1df15eb2-fdba-49b3-831e-9ff30608c249"). InnerVolumeSpecName "kube-api-access-pmd98". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.584620 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf41e648-346e-4019-8a38-80d1508dd2d2-config-data" (OuterVolumeSpecName: "config-data") pod "cf41e648-346e-4019-8a38-80d1508dd2d2" (UID: "cf41e648-346e-4019-8a38-80d1508dd2d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.599167 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df15eb2-fdba-49b3-831e-9ff30608c249-config-data" (OuterVolumeSpecName: "config-data") pod "1df15eb2-fdba-49b3-831e-9ff30608c249" (UID: "1df15eb2-fdba-49b3-831e-9ff30608c249"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.638752 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.639070 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmd98\" (UniqueName: \"kubernetes.io/projected/1df15eb2-fdba-49b3-831e-9ff30608c249-kube-api-access-pmd98\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.639242 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf41e648-346e-4019-8a38-80d1508dd2d2-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.639367 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1df15eb2-fdba-49b3-831e-9ff30608c249-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.639485 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c72nw\" (UniqueName: \"kubernetes.io/projected/cf41e648-346e-4019-8a38-80d1508dd2d2-kube-api-access-c72nw\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.639645 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.639772 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df15eb2-fdba-49b3-831e-9ff30608c249-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.639905 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.640080 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1df15eb2-fdba-49b3-831e-9ff30608c249-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.640206 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-dev\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.640327 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-run\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.640455 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.640603 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.640740 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.640893 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-sys\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.641113 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-sys\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.641259 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1df15eb2-fdba-49b3-831e-9ff30608c249-logs\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.641381 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf41e648-346e-4019-8a38-80d1508dd2d2-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.641528 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.641652 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1df15eb2-fdba-49b3-831e-9ff30608c249-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.641781 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf41e648-346e-4019-8a38-80d1508dd2d2-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.658898 5008 operation_generator.go:917] UnmountDevice 
succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.659648 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.671534 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.677427 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.745113 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.745201 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.745219 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.745236 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.871364 5008 generic.go:334] "Generic (PLEG): container finished" podID="8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" 
containerID="f4ef23dce81c940e4b3b566f6de12c69a4f05aac6498107f3f55e41033094034" exitCode=0 Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.871474 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d","Type":"ContainerDied","Data":"f4ef23dce81c940e4b3b566f6de12c69a4f05aac6498107f3f55e41033094034"} Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.871518 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d","Type":"ContainerDied","Data":"8d0336f891bfc23eb086e87c912b63783365afd890be1db2a6f5160930ccefc4"} Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.871545 5008 scope.go:117] "RemoveContainer" containerID="f4ef23dce81c940e4b3b566f6de12c69a4f05aac6498107f3f55e41033094034" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.871741 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.879265 5008 generic.go:334] "Generic (PLEG): container finished" podID="1df15eb2-fdba-49b3-831e-9ff30608c249" containerID="de27d0158d9b4b33c7060b41d4fbacb1f18aeaae6e3d203c586e4cf10e0328c3" exitCode=0 Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.879333 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"1df15eb2-fdba-49b3-831e-9ff30608c249","Type":"ContainerDied","Data":"de27d0158d9b4b33c7060b41d4fbacb1f18aeaae6e3d203c586e4cf10e0328c3"} Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.879339 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.879362 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"1df15eb2-fdba-49b3-831e-9ff30608c249","Type":"ContainerDied","Data":"6df8db82837475a7f174f0e2de6607faeea55234c1d3bdf4422b96bfaf818d66"} Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.882640 5008 generic.go:334] "Generic (PLEG): container finished" podID="cf41e648-346e-4019-8a38-80d1508dd2d2" containerID="3e66c76616ad120c07ab33d111bed66ba7f2191bfb4d02dc3c1216191575dd72" exitCode=0 Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.882711 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"cf41e648-346e-4019-8a38-80d1508dd2d2","Type":"ContainerDied","Data":"3e66c76616ad120c07ab33d111bed66ba7f2191bfb4d02dc3c1216191575dd72"} Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.882773 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"cf41e648-346e-4019-8a38-80d1508dd2d2","Type":"ContainerDied","Data":"fc6827bfd6137d7fbb8fd960ffaec5795869fe1ace2bf337b088a897e7a300fe"} Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.882822 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.900992 5008 scope.go:117] "RemoveContainer" containerID="2d950477c7980bc552685cb0a2525fb49f76d35b18998ddd8e59a38e33507898" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.930727 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.940778 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.959173 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.964135 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.965583 5008 scope.go:117] "RemoveContainer" containerID="f4ef23dce81c940e4b3b566f6de12c69a4f05aac6498107f3f55e41033094034" Nov 26 23:00:10 crc kubenswrapper[5008]: E1126 23:00:10.966150 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4ef23dce81c940e4b3b566f6de12c69a4f05aac6498107f3f55e41033094034\": container with ID starting with f4ef23dce81c940e4b3b566f6de12c69a4f05aac6498107f3f55e41033094034 not found: ID does not exist" containerID="f4ef23dce81c940e4b3b566f6de12c69a4f05aac6498107f3f55e41033094034" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.966187 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4ef23dce81c940e4b3b566f6de12c69a4f05aac6498107f3f55e41033094034"} err="failed to get container status \"f4ef23dce81c940e4b3b566f6de12c69a4f05aac6498107f3f55e41033094034\": rpc error: code = NotFound desc = could not find container 
\"f4ef23dce81c940e4b3b566f6de12c69a4f05aac6498107f3f55e41033094034\": container with ID starting with f4ef23dce81c940e4b3b566f6de12c69a4f05aac6498107f3f55e41033094034 not found: ID does not exist" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.966213 5008 scope.go:117] "RemoveContainer" containerID="2d950477c7980bc552685cb0a2525fb49f76d35b18998ddd8e59a38e33507898" Nov 26 23:00:10 crc kubenswrapper[5008]: E1126 23:00:10.966679 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d950477c7980bc552685cb0a2525fb49f76d35b18998ddd8e59a38e33507898\": container with ID starting with 2d950477c7980bc552685cb0a2525fb49f76d35b18998ddd8e59a38e33507898 not found: ID does not exist" containerID="2d950477c7980bc552685cb0a2525fb49f76d35b18998ddd8e59a38e33507898" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.966722 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d950477c7980bc552685cb0a2525fb49f76d35b18998ddd8e59a38e33507898"} err="failed to get container status \"2d950477c7980bc552685cb0a2525fb49f76d35b18998ddd8e59a38e33507898\": rpc error: code = NotFound desc = could not find container \"2d950477c7980bc552685cb0a2525fb49f76d35b18998ddd8e59a38e33507898\": container with ID starting with 2d950477c7980bc552685cb0a2525fb49f76d35b18998ddd8e59a38e33507898 not found: ID does not exist" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.966750 5008 scope.go:117] "RemoveContainer" containerID="de27d0158d9b4b33c7060b41d4fbacb1f18aeaae6e3d203c586e4cf10e0328c3" Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.968704 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Nov 26 23:00:10 crc kubenswrapper[5008]: I1126 23:00:10.972991 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Nov 26 23:00:11 crc 
kubenswrapper[5008]: I1126 23:00:11.007312 5008 scope.go:117] "RemoveContainer" containerID="7b46dde8bed842e1bea0c357187cbb0a262e57b11275fc456506add6e86bad9c" Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.023636 5008 scope.go:117] "RemoveContainer" containerID="de27d0158d9b4b33c7060b41d4fbacb1f18aeaae6e3d203c586e4cf10e0328c3" Nov 26 23:00:11 crc kubenswrapper[5008]: E1126 23:00:11.023991 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de27d0158d9b4b33c7060b41d4fbacb1f18aeaae6e3d203c586e4cf10e0328c3\": container with ID starting with de27d0158d9b4b33c7060b41d4fbacb1f18aeaae6e3d203c586e4cf10e0328c3 not found: ID does not exist" containerID="de27d0158d9b4b33c7060b41d4fbacb1f18aeaae6e3d203c586e4cf10e0328c3" Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.024027 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de27d0158d9b4b33c7060b41d4fbacb1f18aeaae6e3d203c586e4cf10e0328c3"} err="failed to get container status \"de27d0158d9b4b33c7060b41d4fbacb1f18aeaae6e3d203c586e4cf10e0328c3\": rpc error: code = NotFound desc = could not find container \"de27d0158d9b4b33c7060b41d4fbacb1f18aeaae6e3d203c586e4cf10e0328c3\": container with ID starting with de27d0158d9b4b33c7060b41d4fbacb1f18aeaae6e3d203c586e4cf10e0328c3 not found: ID does not exist" Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.024053 5008 scope.go:117] "RemoveContainer" containerID="7b46dde8bed842e1bea0c357187cbb0a262e57b11275fc456506add6e86bad9c" Nov 26 23:00:11 crc kubenswrapper[5008]: E1126 23:00:11.024304 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b46dde8bed842e1bea0c357187cbb0a262e57b11275fc456506add6e86bad9c\": container with ID starting with 7b46dde8bed842e1bea0c357187cbb0a262e57b11275fc456506add6e86bad9c not found: ID does not exist" 
containerID="7b46dde8bed842e1bea0c357187cbb0a262e57b11275fc456506add6e86bad9c" Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.024338 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b46dde8bed842e1bea0c357187cbb0a262e57b11275fc456506add6e86bad9c"} err="failed to get container status \"7b46dde8bed842e1bea0c357187cbb0a262e57b11275fc456506add6e86bad9c\": rpc error: code = NotFound desc = could not find container \"7b46dde8bed842e1bea0c357187cbb0a262e57b11275fc456506add6e86bad9c\": container with ID starting with 7b46dde8bed842e1bea0c357187cbb0a262e57b11275fc456506add6e86bad9c not found: ID does not exist" Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.024358 5008 scope.go:117] "RemoveContainer" containerID="3e66c76616ad120c07ab33d111bed66ba7f2191bfb4d02dc3c1216191575dd72" Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.051696 5008 scope.go:117] "RemoveContainer" containerID="f68ddbee848e96ccbbdf85288e5752b18e0e10c5a723022ca70176edb1042c55" Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.070814 5008 scope.go:117] "RemoveContainer" containerID="3e66c76616ad120c07ab33d111bed66ba7f2191bfb4d02dc3c1216191575dd72" Nov 26 23:00:11 crc kubenswrapper[5008]: E1126 23:00:11.071308 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e66c76616ad120c07ab33d111bed66ba7f2191bfb4d02dc3c1216191575dd72\": container with ID starting with 3e66c76616ad120c07ab33d111bed66ba7f2191bfb4d02dc3c1216191575dd72 not found: ID does not exist" containerID="3e66c76616ad120c07ab33d111bed66ba7f2191bfb4d02dc3c1216191575dd72" Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.071355 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e66c76616ad120c07ab33d111bed66ba7f2191bfb4d02dc3c1216191575dd72"} err="failed to get container status 
\"3e66c76616ad120c07ab33d111bed66ba7f2191bfb4d02dc3c1216191575dd72\": rpc error: code = NotFound desc = could not find container \"3e66c76616ad120c07ab33d111bed66ba7f2191bfb4d02dc3c1216191575dd72\": container with ID starting with 3e66c76616ad120c07ab33d111bed66ba7f2191bfb4d02dc3c1216191575dd72 not found: ID does not exist" Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.071387 5008 scope.go:117] "RemoveContainer" containerID="f68ddbee848e96ccbbdf85288e5752b18e0e10c5a723022ca70176edb1042c55" Nov 26 23:00:11 crc kubenswrapper[5008]: E1126 23:00:11.072004 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f68ddbee848e96ccbbdf85288e5752b18e0e10c5a723022ca70176edb1042c55\": container with ID starting with f68ddbee848e96ccbbdf85288e5752b18e0e10c5a723022ca70176edb1042c55 not found: ID does not exist" containerID="f68ddbee848e96ccbbdf85288e5752b18e0e10c5a723022ca70176edb1042c55" Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.072049 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f68ddbee848e96ccbbdf85288e5752b18e0e10c5a723022ca70176edb1042c55"} err="failed to get container status \"f68ddbee848e96ccbbdf85288e5752b18e0e10c5a723022ca70176edb1042c55\": rpc error: code = NotFound desc = could not find container \"f68ddbee848e96ccbbdf85288e5752b18e0e10c5a723022ca70176edb1042c55\": container with ID starting with f68ddbee848e96ccbbdf85288e5752b18e0e10c5a723022ca70176edb1042c55 not found: ID does not exist" Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.101188 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz"] Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.106395 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-1-cleaner-2940330wkxvz"] Nov 26 23:00:11 crc 
kubenswrapper[5008]: I1126 23:00:11.111278 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g"] Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.121658 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-2-cleaner-29403309n24g"] Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.416325 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r"] Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.422482 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-2-cleaner-29403307x27r"] Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.533946 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c5b25d7-9f9f-4962-874e-01dc54b179a8" path="/var/lib/kubelet/pods/1c5b25d7-9f9f-4962-874e-01dc54b179a8/volumes" Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.535844 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1df15eb2-fdba-49b3-831e-9ff30608c249" path="/var/lib/kubelet/pods/1df15eb2-fdba-49b3-831e-9ff30608c249/volumes" Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.537152 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="285f499e-c63a-45ed-9d95-9ad362cd5c69" path="/var/lib/kubelet/pods/285f499e-c63a-45ed-9d95-9ad362cd5c69/volumes" Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.538494 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="434f550d-aa2c-4737-a324-236570bcd516" path="/var/lib/kubelet/pods/434f550d-aa2c-4737-a324-236570bcd516/volumes" Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.540655 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" 
path="/var/lib/kubelet/pods/8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d/volumes" Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.542115 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7b1631a-df7b-468c-af5b-c2b122fc10c2" path="/var/lib/kubelet/pods/c7b1631a-df7b-468c-af5b-c2b122fc10c2/volumes" Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.543895 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf41e648-346e-4019-8a38-80d1508dd2d2" path="/var/lib/kubelet/pods/cf41e648-346e-4019-8a38-80d1508dd2d2/volumes" Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.546610 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f93d1a20-d7fd-441f-a496-c3cb5423ba2c" path="/var/lib/kubelet/pods/f93d1a20-d7fd-441f-a496-c3cb5423ba2c/volumes" Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.997572 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.998085 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" containerName="glance-log" containerID="cri-o://8340eb89c10fc7062c2ee6b1c408bd2024cd8ff3bcdc683c14004139c13e8de7" gracePeriod=30 Nov 26 23:00:11 crc kubenswrapper[5008]: I1126 23:00:11.998238 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" containerName="glance-httpd" containerID="cri-o://036f02d70fe68dfdde18282da4ad181daf3d9d7c9b0efce30f4e75a31b5a061a" gracePeriod=30 Nov 26 23:00:12 crc kubenswrapper[5008]: I1126 23:00:12.570769 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 23:00:12 crc kubenswrapper[5008]: I1126 23:00:12.571471 5008 
kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="8e4686af-ab67-4a04-a97f-48508ff06a2e" containerName="glance-log" containerID="cri-o://f749b687edb24b365e19d42c56ef725c0e6d7204c5df5c2b5f870d295f0b8764" gracePeriod=30 Nov 26 23:00:12 crc kubenswrapper[5008]: I1126 23:00:12.571866 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="8e4686af-ab67-4a04-a97f-48508ff06a2e" containerName="glance-httpd" containerID="cri-o://55ecd3c7ce870459cb822971b9f95c2d4717bf97e84ee95831d11acf572ca6e5" gracePeriod=30 Nov 26 23:00:12 crc kubenswrapper[5008]: I1126 23:00:12.914883 5008 generic.go:334] "Generic (PLEG): container finished" podID="c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" containerID="8340eb89c10fc7062c2ee6b1c408bd2024cd8ff3bcdc683c14004139c13e8de7" exitCode=143 Nov 26 23:00:12 crc kubenswrapper[5008]: I1126 23:00:12.915029 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff","Type":"ContainerDied","Data":"8340eb89c10fc7062c2ee6b1c408bd2024cd8ff3bcdc683c14004139c13e8de7"} Nov 26 23:00:12 crc kubenswrapper[5008]: I1126 23:00:12.918929 5008 generic.go:334] "Generic (PLEG): container finished" podID="8e4686af-ab67-4a04-a97f-48508ff06a2e" containerID="f749b687edb24b365e19d42c56ef725c0e6d7204c5df5c2b5f870d295f0b8764" exitCode=143 Nov 26 23:00:12 crc kubenswrapper[5008]: I1126 23:00:12.918989 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"8e4686af-ab67-4a04-a97f-48508ff06a2e","Type":"ContainerDied","Data":"f749b687edb24b365e19d42c56ef725c0e6d7204c5df5c2b5f870d295f0b8764"} Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.535823 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.637242 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-config-data\") pod \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.637283 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.637329 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-logs\") pod \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.637345 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-var-locks-brick\") pod \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.637398 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-sys\") pod \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.637425 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-etc-iscsi\") pod \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.637437 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-dev\") pod \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.637469 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-run\") pod \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.637500 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ck8c\" (UniqueName: \"kubernetes.io/projected/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-kube-api-access-4ck8c\") pod \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.637729 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-scripts\") pod \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.637773 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-etc-nvme\") pod \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.637807 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-httpd-run\") pod \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.637843 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.637867 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-lib-modules\") pod \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\" (UID: \"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff\") " Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.637907 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" (UID: "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.638091 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" (UID: "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.638142 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" (UID: "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.638219 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-logs" (OuterVolumeSpecName: "logs") pod "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" (UID: "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.638283 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-dev" (OuterVolumeSpecName: "dev") pod "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" (UID: "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.638308 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" (UID: "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.638320 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-sys" (OuterVolumeSpecName: "sys") pod "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" (UID: "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.638376 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" (UID: "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.638487 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-run" (OuterVolumeSpecName: "run") pod "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" (UID: "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.638718 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.638801 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-dev\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.638863 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-run\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.638918 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.639028 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.639085 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.639138 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-logs\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.639202 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.639256 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-sys\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.642740 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" (UID: "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.646363 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-scripts" (OuterVolumeSpecName: "scripts") pod "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" (UID: "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.646698 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" (UID: "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.654178 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-kube-api-access-4ck8c" (OuterVolumeSpecName: "kube-api-access-4ck8c") pod "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" (UID: "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff"). 
InnerVolumeSpecName "kube-api-access-4ck8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.680626 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-config-data" (OuterVolumeSpecName: "config-data") pod "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" (UID: "c4d1aa49-4ec3-4f0e-ae36-d0359d195fff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.740882 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ck8c\" (UniqueName: \"kubernetes.io/projected/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-kube-api-access-4ck8c\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.741162 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.741261 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.741321 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.741394 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.754674 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage03-crc") on node "crc" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.756729 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.842919 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.842946 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.959065 5008 generic.go:334] "Generic (PLEG): container finished" podID="c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" containerID="036f02d70fe68dfdde18282da4ad181daf3d9d7c9b0efce30f4e75a31b5a061a" exitCode=0 Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.959137 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff","Type":"ContainerDied","Data":"036f02d70fe68dfdde18282da4ad181daf3d9d7c9b0efce30f4e75a31b5a061a"} Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.959168 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"c4d1aa49-4ec3-4f0e-ae36-d0359d195fff","Type":"ContainerDied","Data":"7e637a27f9f6b4ea8066077ef7f25f4bf77f2a3fa12f210ae4f75dcdd8b893c3"} Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.959190 5008 scope.go:117] "RemoveContainer" containerID="036f02d70fe68dfdde18282da4ad181daf3d9d7c9b0efce30f4e75a31b5a061a" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.959321 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.962409 5008 generic.go:334] "Generic (PLEG): container finished" podID="8e4686af-ab67-4a04-a97f-48508ff06a2e" containerID="55ecd3c7ce870459cb822971b9f95c2d4717bf97e84ee95831d11acf572ca6e5" exitCode=0 Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.962461 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"8e4686af-ab67-4a04-a97f-48508ff06a2e","Type":"ContainerDied","Data":"55ecd3c7ce870459cb822971b9f95c2d4717bf97e84ee95831d11acf572ca6e5"} Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.982996 5008 scope.go:117] "RemoveContainer" containerID="8340eb89c10fc7062c2ee6b1c408bd2024cd8ff3bcdc683c14004139c13e8de7" Nov 26 23:00:15 crc kubenswrapper[5008]: I1126 23:00:15.992586 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:15.999448 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.025284 5008 scope.go:117] "RemoveContainer" containerID="036f02d70fe68dfdde18282da4ad181daf3d9d7c9b0efce30f4e75a31b5a061a" Nov 26 23:00:16 crc kubenswrapper[5008]: E1126 23:00:16.026415 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"036f02d70fe68dfdde18282da4ad181daf3d9d7c9b0efce30f4e75a31b5a061a\": container with ID starting with 036f02d70fe68dfdde18282da4ad181daf3d9d7c9b0efce30f4e75a31b5a061a not found: ID does not exist" containerID="036f02d70fe68dfdde18282da4ad181daf3d9d7c9b0efce30f4e75a31b5a061a" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.026480 5008 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"036f02d70fe68dfdde18282da4ad181daf3d9d7c9b0efce30f4e75a31b5a061a"} err="failed to get container status \"036f02d70fe68dfdde18282da4ad181daf3d9d7c9b0efce30f4e75a31b5a061a\": rpc error: code = NotFound desc = could not find container \"036f02d70fe68dfdde18282da4ad181daf3d9d7c9b0efce30f4e75a31b5a061a\": container with ID starting with 036f02d70fe68dfdde18282da4ad181daf3d9d7c9b0efce30f4e75a31b5a061a not found: ID does not exist" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.026506 5008 scope.go:117] "RemoveContainer" containerID="8340eb89c10fc7062c2ee6b1c408bd2024cd8ff3bcdc683c14004139c13e8de7" Nov 26 23:00:16 crc kubenswrapper[5008]: E1126 23:00:16.030291 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8340eb89c10fc7062c2ee6b1c408bd2024cd8ff3bcdc683c14004139c13e8de7\": container with ID starting with 8340eb89c10fc7062c2ee6b1c408bd2024cd8ff3bcdc683c14004139c13e8de7 not found: ID does not exist" containerID="8340eb89c10fc7062c2ee6b1c408bd2024cd8ff3bcdc683c14004139c13e8de7" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.030314 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8340eb89c10fc7062c2ee6b1c408bd2024cd8ff3bcdc683c14004139c13e8de7"} err="failed to get container status \"8340eb89c10fc7062c2ee6b1c408bd2024cd8ff3bcdc683c14004139c13e8de7\": rpc error: code = NotFound desc = could not find container \"8340eb89c10fc7062c2ee6b1c408bd2024cd8ff3bcdc683c14004139c13e8de7\": container with ID starting with 8340eb89c10fc7062c2ee6b1c408bd2024cd8ff3bcdc683c14004139c13e8de7 not found: ID does not exist" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.060210 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.145785 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8e4686af-ab67-4a04-a97f-48508ff06a2e-httpd-run\") pod \"8e4686af-ab67-4a04-a97f-48508ff06a2e\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.145816 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-run\") pod \"8e4686af-ab67-4a04-a97f-48508ff06a2e\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.145980 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-run" (OuterVolumeSpecName: "run") pod "8e4686af-ab67-4a04-a97f-48508ff06a2e" (UID: "8e4686af-ab67-4a04-a97f-48508ff06a2e"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.146052 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"8e4686af-ab67-4a04-a97f-48508ff06a2e\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.146136 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e4686af-ab67-4a04-a97f-48508ff06a2e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8e4686af-ab67-4a04-a97f-48508ff06a2e" (UID: "8e4686af-ab67-4a04-a97f-48508ff06a2e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.146643 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-dev\") pod \"8e4686af-ab67-4a04-a97f-48508ff06a2e\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.146678 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-dev" (OuterVolumeSpecName: "dev") pod "8e4686af-ab67-4a04-a97f-48508ff06a2e" (UID: "8e4686af-ab67-4a04-a97f-48508ff06a2e"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.146683 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-etc-iscsi\") pod \"8e4686af-ab67-4a04-a97f-48508ff06a2e\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.146719 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "8e4686af-ab67-4a04-a97f-48508ff06a2e" (UID: "8e4686af-ab67-4a04-a97f-48508ff06a2e"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.146772 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-lib-modules\") pod \"8e4686af-ab67-4a04-a97f-48508ff06a2e\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.146858 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e4686af-ab67-4a04-a97f-48508ff06a2e-scripts\") pod \"8e4686af-ab67-4a04-a97f-48508ff06a2e\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.146873 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "8e4686af-ab67-4a04-a97f-48508ff06a2e" (UID: "8e4686af-ab67-4a04-a97f-48508ff06a2e"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.146877 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"8e4686af-ab67-4a04-a97f-48508ff06a2e\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.146919 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e4686af-ab67-4a04-a97f-48508ff06a2e-config-data\") pod \"8e4686af-ab67-4a04-a97f-48508ff06a2e\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.146957 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xddl\" (UniqueName: \"kubernetes.io/projected/8e4686af-ab67-4a04-a97f-48508ff06a2e-kube-api-access-8xddl\") pod \"8e4686af-ab67-4a04-a97f-48508ff06a2e\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.146998 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-sys\") pod \"8e4686af-ab67-4a04-a97f-48508ff06a2e\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.147016 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-var-locks-brick\") pod \"8e4686af-ab67-4a04-a97f-48508ff06a2e\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.147046 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8e4686af-ab67-4a04-a97f-48508ff06a2e-logs\") pod \"8e4686af-ab67-4a04-a97f-48508ff06a2e\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.147061 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-etc-nvme\") pod \"8e4686af-ab67-4a04-a97f-48508ff06a2e\" (UID: \"8e4686af-ab67-4a04-a97f-48508ff06a2e\") " Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.147355 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "8e4686af-ab67-4a04-a97f-48508ff06a2e" (UID: "8e4686af-ab67-4a04-a97f-48508ff06a2e"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.147420 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "8e4686af-ab67-4a04-a97f-48508ff06a2e" (UID: "8e4686af-ab67-4a04-a97f-48508ff06a2e"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.147616 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-run\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.147622 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e4686af-ab67-4a04-a97f-48508ff06a2e-logs" (OuterVolumeSpecName: "logs") pod "8e4686af-ab67-4a04-a97f-48508ff06a2e" (UID: "8e4686af-ab67-4a04-a97f-48508ff06a2e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.147663 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-sys" (OuterVolumeSpecName: "sys") pod "8e4686af-ab67-4a04-a97f-48508ff06a2e" (UID: "8e4686af-ab67-4a04-a97f-48508ff06a2e"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.147747 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8e4686af-ab67-4a04-a97f-48508ff06a2e-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.147764 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-dev\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.147782 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.147797 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.147814 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.147829 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-etc-nvme\") on node \"crc\" 
DevicePath \"\"" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.150296 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e4686af-ab67-4a04-a97f-48508ff06a2e-kube-api-access-8xddl" (OuterVolumeSpecName: "kube-api-access-8xddl") pod "8e4686af-ab67-4a04-a97f-48508ff06a2e" (UID: "8e4686af-ab67-4a04-a97f-48508ff06a2e"). InnerVolumeSpecName "kube-api-access-8xddl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.150312 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance-cache") pod "8e4686af-ab67-4a04-a97f-48508ff06a2e" (UID: "8e4686af-ab67-4a04-a97f-48508ff06a2e"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.152294 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "8e4686af-ab67-4a04-a97f-48508ff06a2e" (UID: "8e4686af-ab67-4a04-a97f-48508ff06a2e"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.156663 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e4686af-ab67-4a04-a97f-48508ff06a2e-scripts" (OuterVolumeSpecName: "scripts") pod "8e4686af-ab67-4a04-a97f-48508ff06a2e" (UID: "8e4686af-ab67-4a04-a97f-48508ff06a2e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.205274 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e4686af-ab67-4a04-a97f-48508ff06a2e-config-data" (OuterVolumeSpecName: "config-data") pod "8e4686af-ab67-4a04-a97f-48508ff06a2e" (UID: "8e4686af-ab67-4a04-a97f-48508ff06a2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.249348 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e4686af-ab67-4a04-a97f-48508ff06a2e-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.249614 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.249628 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e4686af-ab67-4a04-a97f-48508ff06a2e-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.249637 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xddl\" (UniqueName: \"kubernetes.io/projected/8e4686af-ab67-4a04-a97f-48508ff06a2e-kube-api-access-8xddl\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.249649 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8e4686af-ab67-4a04-a97f-48508ff06a2e-sys\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.249657 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e4686af-ab67-4a04-a97f-48508ff06a2e-logs\") on node 
\"crc\" DevicePath \"\"" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.249671 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.261511 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.276329 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.350870 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.350917 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.371415 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp"] Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.378422 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940330g6twp"] Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.979210 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"8e4686af-ab67-4a04-a97f-48508ff06a2e","Type":"ContainerDied","Data":"789deabd240c3061a4ad9b95415eba75d1082ac206c50cfd73637d27e98ac84c"} Nov 26 23:00:16 crc 
kubenswrapper[5008]: I1126 23:00:16.979309 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:00:16 crc kubenswrapper[5008]: I1126 23:00:16.979357 5008 scope.go:117] "RemoveContainer" containerID="55ecd3c7ce870459cb822971b9f95c2d4717bf97e84ee95831d11acf572ca6e5" Nov 26 23:00:17 crc kubenswrapper[5008]: I1126 23:00:17.026333 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 23:00:17 crc kubenswrapper[5008]: I1126 23:00:17.035619 5008 scope.go:117] "RemoveContainer" containerID="f749b687edb24b365e19d42c56ef725c0e6d7204c5df5c2b5f870d295f0b8764" Nov 26 23:00:17 crc kubenswrapper[5008]: I1126 23:00:17.036149 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 23:00:17 crc kubenswrapper[5008]: I1126 23:00:17.533634 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4efc663c-4d69-4417-b978-e705a3157b4e" path="/var/lib/kubelet/pods/4efc663c-4d69-4417-b978-e705a3157b4e/volumes" Nov 26 23:00:17 crc kubenswrapper[5008]: I1126 23:00:17.535246 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e4686af-ab67-4a04-a97f-48508ff06a2e" path="/var/lib/kubelet/pods/8e4686af-ab67-4a04-a97f-48508ff06a2e/volumes" Nov 26 23:00:17 crc kubenswrapper[5008]: I1126 23:00:17.536518 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" path="/var/lib/kubelet/pods/c4d1aa49-4ec3-4f0e-ae36-d0359d195fff/volumes" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.161203 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn"] Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.165042 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940330tv6wn"] Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.483814 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-gzdw5"] Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.493904 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-gzdw5"] Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.513798 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance6893-account-delete-xjb6d"] Nov 26 23:00:18 crc kubenswrapper[5008]: E1126 23:00:18.514713 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="285f499e-c63a-45ed-9d95-9ad362cd5c69" containerName="glance-cache-glance-default-external-api-1-cleaner" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.514750 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="285f499e-c63a-45ed-9d95-9ad362cd5c69" containerName="glance-cache-glance-default-external-api-1-cleaner" Nov 26 23:00:18 crc kubenswrapper[5008]: E1126 23:00:18.514768 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c5b25d7-9f9f-4962-874e-01dc54b179a8" containerName="glance-log" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.514782 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c5b25d7-9f9f-4962-874e-01dc54b179a8" containerName="glance-log" Nov 26 23:00:18 crc kubenswrapper[5008]: E1126 23:00:18.514812 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df15eb2-fdba-49b3-831e-9ff30608c249" containerName="glance-log" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.514824 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df15eb2-fdba-49b3-831e-9ff30608c249" containerName="glance-log" Nov 26 23:00:18 crc kubenswrapper[5008]: E1126 23:00:18.514849 5008 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f93d1a20-d7fd-441f-a496-c3cb5423ba2c" containerName="glance-cache-glance-default-external-api-2-cleaner" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.514861 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93d1a20-d7fd-441f-a496-c3cb5423ba2c" containerName="glance-cache-glance-default-external-api-2-cleaner" Nov 26 23:00:18 crc kubenswrapper[5008]: E1126 23:00:18.514882 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e4686af-ab67-4a04-a97f-48508ff06a2e" containerName="glance-log" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.514894 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e4686af-ab67-4a04-a97f-48508ff06a2e" containerName="glance-log" Nov 26 23:00:18 crc kubenswrapper[5008]: E1126 23:00:18.514917 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4efc663c-4d69-4417-b978-e705a3157b4e" containerName="glance-cache-glance-default-external-api-0-cleaner" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.514930 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="4efc663c-4d69-4417-b978-e705a3157b4e" containerName="glance-cache-glance-default-external-api-0-cleaner" Nov 26 23:00:18 crc kubenswrapper[5008]: E1126 23:00:18.514950 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e4686af-ab67-4a04-a97f-48508ff06a2e" containerName="glance-httpd" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.514988 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e4686af-ab67-4a04-a97f-48508ff06a2e" containerName="glance-httpd" Nov 26 23:00:18 crc kubenswrapper[5008]: E1126 23:00:18.515015 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c5b25d7-9f9f-4962-874e-01dc54b179a8" containerName="glance-httpd" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515027 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c5b25d7-9f9f-4962-874e-01dc54b179a8" containerName="glance-httpd" Nov 26 23:00:18 crc 
kubenswrapper[5008]: E1126 23:00:18.515049 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" containerName="glance-httpd" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515061 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" containerName="glance-httpd" Nov 26 23:00:18 crc kubenswrapper[5008]: E1126 23:00:18.515082 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf41e648-346e-4019-8a38-80d1508dd2d2" containerName="glance-log" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515093 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf41e648-346e-4019-8a38-80d1508dd2d2" containerName="glance-log" Nov 26 23:00:18 crc kubenswrapper[5008]: E1126 23:00:18.515113 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" containerName="glance-httpd" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515126 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" containerName="glance-httpd" Nov 26 23:00:18 crc kubenswrapper[5008]: E1126 23:00:18.515149 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" containerName="glance-log" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515162 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" containerName="glance-log" Nov 26 23:00:18 crc kubenswrapper[5008]: E1126 23:00:18.515179 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c5820a8-589a-4676-b36e-a1c90954ef93" containerName="collect-profiles" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515191 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c5820a8-589a-4676-b36e-a1c90954ef93" containerName="collect-profiles" Nov 26 23:00:18 crc kubenswrapper[5008]: E1126 23:00:18.515212 
5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf41e648-346e-4019-8a38-80d1508dd2d2" containerName="glance-httpd" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515224 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf41e648-346e-4019-8a38-80d1508dd2d2" containerName="glance-httpd" Nov 26 23:00:18 crc kubenswrapper[5008]: E1126 23:00:18.515243 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ecc7bd-bb80-4362-8b26-d1f7f78b1455" containerName="glance-cache-glance-default-internal-api-0-cleaner" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515260 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ecc7bd-bb80-4362-8b26-d1f7f78b1455" containerName="glance-cache-glance-default-internal-api-0-cleaner" Nov 26 23:00:18 crc kubenswrapper[5008]: E1126 23:00:18.515288 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" containerName="glance-log" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515301 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" containerName="glance-log" Nov 26 23:00:18 crc kubenswrapper[5008]: E1126 23:00:18.515324 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df15eb2-fdba-49b3-831e-9ff30608c249" containerName="glance-httpd" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515337 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df15eb2-fdba-49b3-831e-9ff30608c249" containerName="glance-httpd" Nov 26 23:00:18 crc kubenswrapper[5008]: E1126 23:00:18.515358 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b1631a-df7b-468c-af5b-c2b122fc10c2" containerName="glance-cache-glance-default-internal-api-2-cleaner" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515371 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b1631a-df7b-468c-af5b-c2b122fc10c2" 
containerName="glance-cache-glance-default-internal-api-2-cleaner" Nov 26 23:00:18 crc kubenswrapper[5008]: E1126 23:00:18.515393 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="434f550d-aa2c-4737-a324-236570bcd516" containerName="glance-cache-glance-default-internal-api-1-cleaner" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515408 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="434f550d-aa2c-4737-a324-236570bcd516" containerName="glance-cache-glance-default-internal-api-1-cleaner" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515609 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c5b25d7-9f9f-4962-874e-01dc54b179a8" containerName="glance-log" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515627 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e4686af-ab67-4a04-a97f-48508ff06a2e" containerName="glance-log" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515645 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf41e648-346e-4019-8a38-80d1508dd2d2" containerName="glance-log" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515661 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c5b25d7-9f9f-4962-874e-01dc54b179a8" containerName="glance-httpd" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515678 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7b1631a-df7b-468c-af5b-c2b122fc10c2" containerName="glance-cache-glance-default-internal-api-2-cleaner" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515691 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f93d1a20-d7fd-441f-a496-c3cb5423ba2c" containerName="glance-cache-glance-default-external-api-2-cleaner" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515705 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c5820a8-589a-4676-b36e-a1c90954ef93" 
containerName="collect-profiles" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515720 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" containerName="glance-log" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515740 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="54ecc7bd-bb80-4362-8b26-d1f7f78b1455" containerName="glance-cache-glance-default-internal-api-0-cleaner" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515761 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e4686af-ab67-4a04-a97f-48508ff06a2e" containerName="glance-httpd" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515774 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="4efc663c-4d69-4417-b978-e705a3157b4e" containerName="glance-cache-glance-default-external-api-0-cleaner" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515798 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df15eb2-fdba-49b3-831e-9ff30608c249" containerName="glance-log" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515811 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="434f550d-aa2c-4737-a324-236570bcd516" containerName="glance-cache-glance-default-internal-api-1-cleaner" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515829 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" containerName="glance-httpd" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515844 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f23a622-9f8e-42cb-b6a8-b8e2fce1e99d" containerName="glance-log" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515863 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf41e648-346e-4019-8a38-80d1508dd2d2" containerName="glance-httpd" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 
23:00:18.515884 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df15eb2-fdba-49b3-831e-9ff30608c249" containerName="glance-httpd" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515898 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="285f499e-c63a-45ed-9d95-9ad362cd5c69" containerName="glance-cache-glance-default-external-api-1-cleaner" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.515913 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d1aa49-4ec3-4f0e-ae36-d0359d195fff" containerName="glance-httpd" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.516658 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance6893-account-delete-xjb6d" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.520750 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance6893-account-delete-xjb6d"] Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.590364 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbnnp\" (UniqueName: \"kubernetes.io/projected/339a838b-ce0a-4c71-befe-1e551e37593c-kube-api-access-sbnnp\") pod \"glance6893-account-delete-xjb6d\" (UID: \"339a838b-ce0a-4c71-befe-1e551e37593c\") " pod="glance-kuttl-tests/glance6893-account-delete-xjb6d" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.590646 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/339a838b-ce0a-4c71-befe-1e551e37593c-operator-scripts\") pod \"glance6893-account-delete-xjb6d\" (UID: \"339a838b-ce0a-4c71-befe-1e551e37593c\") " pod="glance-kuttl-tests/glance6893-account-delete-xjb6d" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.691862 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbnnp\" 
(UniqueName: \"kubernetes.io/projected/339a838b-ce0a-4c71-befe-1e551e37593c-kube-api-access-sbnnp\") pod \"glance6893-account-delete-xjb6d\" (UID: \"339a838b-ce0a-4c71-befe-1e551e37593c\") " pod="glance-kuttl-tests/glance6893-account-delete-xjb6d" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.691930 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/339a838b-ce0a-4c71-befe-1e551e37593c-operator-scripts\") pod \"glance6893-account-delete-xjb6d\" (UID: \"339a838b-ce0a-4c71-befe-1e551e37593c\") " pod="glance-kuttl-tests/glance6893-account-delete-xjb6d" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.692615 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/339a838b-ce0a-4c71-befe-1e551e37593c-operator-scripts\") pod \"glance6893-account-delete-xjb6d\" (UID: \"339a838b-ce0a-4c71-befe-1e551e37593c\") " pod="glance-kuttl-tests/glance6893-account-delete-xjb6d" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.719664 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbnnp\" (UniqueName: \"kubernetes.io/projected/339a838b-ce0a-4c71-befe-1e551e37593c-kube-api-access-sbnnp\") pod \"glance6893-account-delete-xjb6d\" (UID: \"339a838b-ce0a-4c71-befe-1e551e37593c\") " pod="glance-kuttl-tests/glance6893-account-delete-xjb6d" Nov 26 23:00:18 crc kubenswrapper[5008]: I1126 23:00:18.836600 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance6893-account-delete-xjb6d" Nov 26 23:00:19 crc kubenswrapper[5008]: I1126 23:00:19.356037 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance6893-account-delete-xjb6d"] Nov 26 23:00:19 crc kubenswrapper[5008]: I1126 23:00:19.528099 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54ecc7bd-bb80-4362-8b26-d1f7f78b1455" path="/var/lib/kubelet/pods/54ecc7bd-bb80-4362-8b26-d1f7f78b1455/volumes" Nov 26 23:00:19 crc kubenswrapper[5008]: I1126 23:00:19.529339 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="580f0ae9-1761-4d22-b98b-bd4af7c2752f" path="/var/lib/kubelet/pods/580f0ae9-1761-4d22-b98b-bd4af7c2752f/volumes" Nov 26 23:00:20 crc kubenswrapper[5008]: I1126 23:00:20.016662 5008 generic.go:334] "Generic (PLEG): container finished" podID="339a838b-ce0a-4c71-befe-1e551e37593c" containerID="9eb534a6b835c40a77cdea77c1255dcbfbec6e131e0911fdbf21bddfc7e2f38e" exitCode=0 Nov 26 23:00:20 crc kubenswrapper[5008]: I1126 23:00:20.016722 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance6893-account-delete-xjb6d" event={"ID":"339a838b-ce0a-4c71-befe-1e551e37593c","Type":"ContainerDied","Data":"9eb534a6b835c40a77cdea77c1255dcbfbec6e131e0911fdbf21bddfc7e2f38e"} Nov 26 23:00:20 crc kubenswrapper[5008]: I1126 23:00:20.016762 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance6893-account-delete-xjb6d" event={"ID":"339a838b-ce0a-4c71-befe-1e551e37593c","Type":"ContainerStarted","Data":"39345b85881ddd0c9734572d03bb9960083c04b1da88eed4252d2bce7b3241fc"} Nov 26 23:00:21 crc kubenswrapper[5008]: I1126 23:00:21.378294 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance6893-account-delete-xjb6d" Nov 26 23:00:21 crc kubenswrapper[5008]: I1126 23:00:21.437716 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/339a838b-ce0a-4c71-befe-1e551e37593c-operator-scripts\") pod \"339a838b-ce0a-4c71-befe-1e551e37593c\" (UID: \"339a838b-ce0a-4c71-befe-1e551e37593c\") " Nov 26 23:00:21 crc kubenswrapper[5008]: I1126 23:00:21.437775 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbnnp\" (UniqueName: \"kubernetes.io/projected/339a838b-ce0a-4c71-befe-1e551e37593c-kube-api-access-sbnnp\") pod \"339a838b-ce0a-4c71-befe-1e551e37593c\" (UID: \"339a838b-ce0a-4c71-befe-1e551e37593c\") " Nov 26 23:00:21 crc kubenswrapper[5008]: I1126 23:00:21.438897 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/339a838b-ce0a-4c71-befe-1e551e37593c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "339a838b-ce0a-4c71-befe-1e551e37593c" (UID: "339a838b-ce0a-4c71-befe-1e551e37593c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 23:00:21 crc kubenswrapper[5008]: I1126 23:00:21.444491 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/339a838b-ce0a-4c71-befe-1e551e37593c-kube-api-access-sbnnp" (OuterVolumeSpecName: "kube-api-access-sbnnp") pod "339a838b-ce0a-4c71-befe-1e551e37593c" (UID: "339a838b-ce0a-4c71-befe-1e551e37593c"). InnerVolumeSpecName "kube-api-access-sbnnp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:00:21 crc kubenswrapper[5008]: I1126 23:00:21.539665 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/339a838b-ce0a-4c71-befe-1e551e37593c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:21 crc kubenswrapper[5008]: I1126 23:00:21.539705 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbnnp\" (UniqueName: \"kubernetes.io/projected/339a838b-ce0a-4c71-befe-1e551e37593c-kube-api-access-sbnnp\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:22 crc kubenswrapper[5008]: I1126 23:00:22.035310 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance6893-account-delete-xjb6d" event={"ID":"339a838b-ce0a-4c71-befe-1e551e37593c","Type":"ContainerDied","Data":"39345b85881ddd0c9734572d03bb9960083c04b1da88eed4252d2bce7b3241fc"} Nov 26 23:00:22 crc kubenswrapper[5008]: I1126 23:00:22.035346 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39345b85881ddd0c9734572d03bb9960083c04b1da88eed4252d2bce7b3241fc" Nov 26 23:00:22 crc kubenswrapper[5008]: I1126 23:00:22.035396 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance6893-account-delete-xjb6d" Nov 26 23:00:23 crc kubenswrapper[5008]: I1126 23:00:23.561607 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-584km"] Nov 26 23:00:23 crc kubenswrapper[5008]: I1126 23:00:23.573744 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-584km"] Nov 26 23:00:23 crc kubenswrapper[5008]: I1126 23:00:23.588674 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-6893-account-create-update-25vzs"] Nov 26 23:00:23 crc kubenswrapper[5008]: I1126 23:00:23.599327 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance6893-account-delete-xjb6d"] Nov 26 23:00:23 crc kubenswrapper[5008]: I1126 23:00:23.609461 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-6893-account-create-update-25vzs"] Nov 26 23:00:23 crc kubenswrapper[5008]: I1126 23:00:23.615495 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance6893-account-delete-xjb6d"] Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.363993 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-88296"] Nov 26 23:00:24 crc kubenswrapper[5008]: E1126 23:00:24.364470 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="339a838b-ce0a-4c71-befe-1e551e37593c" containerName="mariadb-account-delete" Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.364511 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="339a838b-ce0a-4c71-befe-1e551e37593c" containerName="mariadb-account-delete" Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.364783 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="339a838b-ce0a-4c71-befe-1e551e37593c" containerName="mariadb-account-delete" Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.365617 5008 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-88296" Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.371538 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-01d8-account-create-update-4xlrs"] Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.373338 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-01d8-account-create-update-4xlrs" Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.375498 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.382358 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-01d8-account-create-update-4xlrs"] Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.391444 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-88296"] Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.481932 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fhh7\" (UniqueName: \"kubernetes.io/projected/56181be6-d505-4a52-b389-a96288fcb920-kube-api-access-4fhh7\") pod \"glance-db-create-88296\" (UID: \"56181be6-d505-4a52-b389-a96288fcb920\") " pod="glance-kuttl-tests/glance-db-create-88296" Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.482035 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56181be6-d505-4a52-b389-a96288fcb920-operator-scripts\") pod \"glance-db-create-88296\" (UID: \"56181be6-d505-4a52-b389-a96288fcb920\") " pod="glance-kuttl-tests/glance-db-create-88296" Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.482092 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-p5scs\" (UniqueName: \"kubernetes.io/projected/0baeb926-09fd-4832-a4ee-7bee3734bcf1-kube-api-access-p5scs\") pod \"glance-01d8-account-create-update-4xlrs\" (UID: \"0baeb926-09fd-4832-a4ee-7bee3734bcf1\") " pod="glance-kuttl-tests/glance-01d8-account-create-update-4xlrs" Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.482160 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0baeb926-09fd-4832-a4ee-7bee3734bcf1-operator-scripts\") pod \"glance-01d8-account-create-update-4xlrs\" (UID: \"0baeb926-09fd-4832-a4ee-7bee3734bcf1\") " pod="glance-kuttl-tests/glance-01d8-account-create-update-4xlrs" Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.582997 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5scs\" (UniqueName: \"kubernetes.io/projected/0baeb926-09fd-4832-a4ee-7bee3734bcf1-kube-api-access-p5scs\") pod \"glance-01d8-account-create-update-4xlrs\" (UID: \"0baeb926-09fd-4832-a4ee-7bee3734bcf1\") " pod="glance-kuttl-tests/glance-01d8-account-create-update-4xlrs" Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.583049 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0baeb926-09fd-4832-a4ee-7bee3734bcf1-operator-scripts\") pod \"glance-01d8-account-create-update-4xlrs\" (UID: \"0baeb926-09fd-4832-a4ee-7bee3734bcf1\") " pod="glance-kuttl-tests/glance-01d8-account-create-update-4xlrs" Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.583110 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fhh7\" (UniqueName: \"kubernetes.io/projected/56181be6-d505-4a52-b389-a96288fcb920-kube-api-access-4fhh7\") pod \"glance-db-create-88296\" (UID: \"56181be6-d505-4a52-b389-a96288fcb920\") " 
pod="glance-kuttl-tests/glance-db-create-88296" Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.583143 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56181be6-d505-4a52-b389-a96288fcb920-operator-scripts\") pod \"glance-db-create-88296\" (UID: \"56181be6-d505-4a52-b389-a96288fcb920\") " pod="glance-kuttl-tests/glance-db-create-88296" Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.583772 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56181be6-d505-4a52-b389-a96288fcb920-operator-scripts\") pod \"glance-db-create-88296\" (UID: \"56181be6-d505-4a52-b389-a96288fcb920\") " pod="glance-kuttl-tests/glance-db-create-88296" Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.584560 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0baeb926-09fd-4832-a4ee-7bee3734bcf1-operator-scripts\") pod \"glance-01d8-account-create-update-4xlrs\" (UID: \"0baeb926-09fd-4832-a4ee-7bee3734bcf1\") " pod="glance-kuttl-tests/glance-01d8-account-create-update-4xlrs" Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.608515 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fhh7\" (UniqueName: \"kubernetes.io/projected/56181be6-d505-4a52-b389-a96288fcb920-kube-api-access-4fhh7\") pod \"glance-db-create-88296\" (UID: \"56181be6-d505-4a52-b389-a96288fcb920\") " pod="glance-kuttl-tests/glance-db-create-88296" Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.610697 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5scs\" (UniqueName: \"kubernetes.io/projected/0baeb926-09fd-4832-a4ee-7bee3734bcf1-kube-api-access-p5scs\") pod \"glance-01d8-account-create-update-4xlrs\" (UID: \"0baeb926-09fd-4832-a4ee-7bee3734bcf1\") " 
pod="glance-kuttl-tests/glance-01d8-account-create-update-4xlrs" Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.706586 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-88296" Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.727340 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-01d8-account-create-update-4xlrs" Nov 26 23:00:24 crc kubenswrapper[5008]: I1126 23:00:24.965614 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-88296"] Nov 26 23:00:25 crc kubenswrapper[5008]: I1126 23:00:25.032365 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-01d8-account-create-update-4xlrs"] Nov 26 23:00:25 crc kubenswrapper[5008]: W1126 23:00:25.044978 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0baeb926_09fd_4832_a4ee_7bee3734bcf1.slice/crio-d34196e73ac7258ecad8b1c2c0754e180ff9148d784316bb543804e0a9603af7 WatchSource:0}: Error finding container d34196e73ac7258ecad8b1c2c0754e180ff9148d784316bb543804e0a9603af7: Status 404 returned error can't find the container with id d34196e73ac7258ecad8b1c2c0754e180ff9148d784316bb543804e0a9603af7 Nov 26 23:00:25 crc kubenswrapper[5008]: I1126 23:00:25.064585 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-88296" event={"ID":"56181be6-d505-4a52-b389-a96288fcb920","Type":"ContainerStarted","Data":"f90fe2a50f499f9aadf9a9070339367c20e8e90b2180f1233985c278a6d02de1"} Nov 26 23:00:25 crc kubenswrapper[5008]: I1126 23:00:25.065974 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-01d8-account-create-update-4xlrs" 
event={"ID":"0baeb926-09fd-4832-a4ee-7bee3734bcf1","Type":"ContainerStarted","Data":"d34196e73ac7258ecad8b1c2c0754e180ff9148d784316bb543804e0a9603af7"} Nov 26 23:00:25 crc kubenswrapper[5008]: I1126 23:00:25.534716 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="339a838b-ce0a-4c71-befe-1e551e37593c" path="/var/lib/kubelet/pods/339a838b-ce0a-4c71-befe-1e551e37593c/volumes" Nov 26 23:00:25 crc kubenswrapper[5008]: I1126 23:00:25.536649 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ca02e5f-27d8-4215-bb7c-491373544006" path="/var/lib/kubelet/pods/4ca02e5f-27d8-4215-bb7c-491373544006/volumes" Nov 26 23:00:25 crc kubenswrapper[5008]: I1126 23:00:25.537843 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a105ca6d-90e5-4f65-959e-59888265f31c" path="/var/lib/kubelet/pods/a105ca6d-90e5-4f65-959e-59888265f31c/volumes" Nov 26 23:00:26 crc kubenswrapper[5008]: I1126 23:00:26.072173 5008 generic.go:334] "Generic (PLEG): container finished" podID="56181be6-d505-4a52-b389-a96288fcb920" containerID="31f1bbea8aa9d61c26bc39c564ab146ebef489e0cfd8482db8a0e80c41e858ac" exitCode=0 Nov 26 23:00:26 crc kubenswrapper[5008]: I1126 23:00:26.072236 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-88296" event={"ID":"56181be6-d505-4a52-b389-a96288fcb920","Type":"ContainerDied","Data":"31f1bbea8aa9d61c26bc39c564ab146ebef489e0cfd8482db8a0e80c41e858ac"} Nov 26 23:00:26 crc kubenswrapper[5008]: I1126 23:00:26.073522 5008 generic.go:334] "Generic (PLEG): container finished" podID="0baeb926-09fd-4832-a4ee-7bee3734bcf1" containerID="56ce7211c234fa44549e2b2ebb917f11a110d575cf8467083faf941a340221f2" exitCode=0 Nov 26 23:00:26 crc kubenswrapper[5008]: I1126 23:00:26.073554 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-01d8-account-create-update-4xlrs" 
event={"ID":"0baeb926-09fd-4832-a4ee-7bee3734bcf1","Type":"ContainerDied","Data":"56ce7211c234fa44549e2b2ebb917f11a110d575cf8467083faf941a340221f2"} Nov 26 23:00:27 crc kubenswrapper[5008]: I1126 23:00:27.481700 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-01d8-account-create-update-4xlrs" Nov 26 23:00:27 crc kubenswrapper[5008]: I1126 23:00:27.486479 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-88296" Nov 26 23:00:27 crc kubenswrapper[5008]: I1126 23:00:27.634379 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fhh7\" (UniqueName: \"kubernetes.io/projected/56181be6-d505-4a52-b389-a96288fcb920-kube-api-access-4fhh7\") pod \"56181be6-d505-4a52-b389-a96288fcb920\" (UID: \"56181be6-d505-4a52-b389-a96288fcb920\") " Nov 26 23:00:27 crc kubenswrapper[5008]: I1126 23:00:27.634457 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0baeb926-09fd-4832-a4ee-7bee3734bcf1-operator-scripts\") pod \"0baeb926-09fd-4832-a4ee-7bee3734bcf1\" (UID: \"0baeb926-09fd-4832-a4ee-7bee3734bcf1\") " Nov 26 23:00:27 crc kubenswrapper[5008]: I1126 23:00:27.634592 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56181be6-d505-4a52-b389-a96288fcb920-operator-scripts\") pod \"56181be6-d505-4a52-b389-a96288fcb920\" (UID: \"56181be6-d505-4a52-b389-a96288fcb920\") " Nov 26 23:00:27 crc kubenswrapper[5008]: I1126 23:00:27.634700 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5scs\" (UniqueName: \"kubernetes.io/projected/0baeb926-09fd-4832-a4ee-7bee3734bcf1-kube-api-access-p5scs\") pod \"0baeb926-09fd-4832-a4ee-7bee3734bcf1\" (UID: 
\"0baeb926-09fd-4832-a4ee-7bee3734bcf1\") " Nov 26 23:00:27 crc kubenswrapper[5008]: I1126 23:00:27.635713 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0baeb926-09fd-4832-a4ee-7bee3734bcf1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0baeb926-09fd-4832-a4ee-7bee3734bcf1" (UID: "0baeb926-09fd-4832-a4ee-7bee3734bcf1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 23:00:27 crc kubenswrapper[5008]: I1126 23:00:27.635788 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56181be6-d505-4a52-b389-a96288fcb920-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "56181be6-d505-4a52-b389-a96288fcb920" (UID: "56181be6-d505-4a52-b389-a96288fcb920"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 23:00:27 crc kubenswrapper[5008]: I1126 23:00:27.640211 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0baeb926-09fd-4832-a4ee-7bee3734bcf1-kube-api-access-p5scs" (OuterVolumeSpecName: "kube-api-access-p5scs") pod "0baeb926-09fd-4832-a4ee-7bee3734bcf1" (UID: "0baeb926-09fd-4832-a4ee-7bee3734bcf1"). InnerVolumeSpecName "kube-api-access-p5scs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:00:27 crc kubenswrapper[5008]: I1126 23:00:27.642267 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56181be6-d505-4a52-b389-a96288fcb920-kube-api-access-4fhh7" (OuterVolumeSpecName: "kube-api-access-4fhh7") pod "56181be6-d505-4a52-b389-a96288fcb920" (UID: "56181be6-d505-4a52-b389-a96288fcb920"). InnerVolumeSpecName "kube-api-access-4fhh7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:00:27 crc kubenswrapper[5008]: I1126 23:00:27.738174 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56181be6-d505-4a52-b389-a96288fcb920-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:27 crc kubenswrapper[5008]: I1126 23:00:27.738228 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5scs\" (UniqueName: \"kubernetes.io/projected/0baeb926-09fd-4832-a4ee-7bee3734bcf1-kube-api-access-p5scs\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:27 crc kubenswrapper[5008]: I1126 23:00:27.738250 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fhh7\" (UniqueName: \"kubernetes.io/projected/56181be6-d505-4a52-b389-a96288fcb920-kube-api-access-4fhh7\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:27 crc kubenswrapper[5008]: I1126 23:00:27.738268 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0baeb926-09fd-4832-a4ee-7bee3734bcf1-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:28 crc kubenswrapper[5008]: I1126 23:00:28.093663 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-88296" event={"ID":"56181be6-d505-4a52-b389-a96288fcb920","Type":"ContainerDied","Data":"f90fe2a50f499f9aadf9a9070339367c20e8e90b2180f1233985c278a6d02de1"} Nov 26 23:00:28 crc kubenswrapper[5008]: I1126 23:00:28.094034 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f90fe2a50f499f9aadf9a9070339367c20e8e90b2180f1233985c278a6d02de1" Nov 26 23:00:28 crc kubenswrapper[5008]: I1126 23:00:28.093680 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-88296" Nov 26 23:00:28 crc kubenswrapper[5008]: I1126 23:00:28.097108 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-01d8-account-create-update-4xlrs" event={"ID":"0baeb926-09fd-4832-a4ee-7bee3734bcf1","Type":"ContainerDied","Data":"d34196e73ac7258ecad8b1c2c0754e180ff9148d784316bb543804e0a9603af7"} Nov 26 23:00:28 crc kubenswrapper[5008]: I1126 23:00:28.097148 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d34196e73ac7258ecad8b1c2c0754e180ff9148d784316bb543804e0a9603af7" Nov 26 23:00:28 crc kubenswrapper[5008]: I1126 23:00:28.097202 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-01d8-account-create-update-4xlrs" Nov 26 23:00:29 crc kubenswrapper[5008]: I1126 23:00:29.281713 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 23:00:29 crc kubenswrapper[5008]: I1126 23:00:29.281794 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 23:00:29 crc kubenswrapper[5008]: I1126 23:00:29.496485 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-md85d"] Nov 26 23:00:29 crc kubenswrapper[5008]: E1126 23:00:29.497035 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0baeb926-09fd-4832-a4ee-7bee3734bcf1" containerName="mariadb-account-create-update" Nov 26 23:00:29 crc 
kubenswrapper[5008]: I1126 23:00:29.497064 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0baeb926-09fd-4832-a4ee-7bee3734bcf1" containerName="mariadb-account-create-update" Nov 26 23:00:29 crc kubenswrapper[5008]: E1126 23:00:29.497081 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56181be6-d505-4a52-b389-a96288fcb920" containerName="mariadb-database-create" Nov 26 23:00:29 crc kubenswrapper[5008]: I1126 23:00:29.497099 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="56181be6-d505-4a52-b389-a96288fcb920" containerName="mariadb-database-create" Nov 26 23:00:29 crc kubenswrapper[5008]: I1126 23:00:29.497412 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="56181be6-d505-4a52-b389-a96288fcb920" containerName="mariadb-database-create" Nov 26 23:00:29 crc kubenswrapper[5008]: I1126 23:00:29.497475 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="0baeb926-09fd-4832-a4ee-7bee3734bcf1" containerName="mariadb-account-create-update" Nov 26 23:00:29 crc kubenswrapper[5008]: I1126 23:00:29.498637 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-md85d" Nov 26 23:00:29 crc kubenswrapper[5008]: I1126 23:00:29.503341 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Nov 26 23:00:29 crc kubenswrapper[5008]: I1126 23:00:29.504127 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-zg65s" Nov 26 23:00:29 crc kubenswrapper[5008]: I1126 23:00:29.515558 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-md85d"] Nov 26 23:00:29 crc kubenswrapper[5008]: I1126 23:00:29.568935 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19002a58-e28b-4950-a48d-ba98820e57fe-config-data\") pod \"glance-db-sync-md85d\" (UID: \"19002a58-e28b-4950-a48d-ba98820e57fe\") " pod="glance-kuttl-tests/glance-db-sync-md85d" Nov 26 23:00:29 crc kubenswrapper[5008]: I1126 23:00:29.569081 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttffq\" (UniqueName: \"kubernetes.io/projected/19002a58-e28b-4950-a48d-ba98820e57fe-kube-api-access-ttffq\") pod \"glance-db-sync-md85d\" (UID: \"19002a58-e28b-4950-a48d-ba98820e57fe\") " pod="glance-kuttl-tests/glance-db-sync-md85d" Nov 26 23:00:29 crc kubenswrapper[5008]: I1126 23:00:29.569222 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/19002a58-e28b-4950-a48d-ba98820e57fe-db-sync-config-data\") pod \"glance-db-sync-md85d\" (UID: \"19002a58-e28b-4950-a48d-ba98820e57fe\") " pod="glance-kuttl-tests/glance-db-sync-md85d" Nov 26 23:00:29 crc kubenswrapper[5008]: I1126 23:00:29.670395 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/19002a58-e28b-4950-a48d-ba98820e57fe-db-sync-config-data\") pod \"glance-db-sync-md85d\" (UID: \"19002a58-e28b-4950-a48d-ba98820e57fe\") " pod="glance-kuttl-tests/glance-db-sync-md85d" Nov 26 23:00:29 crc kubenswrapper[5008]: I1126 23:00:29.670572 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19002a58-e28b-4950-a48d-ba98820e57fe-config-data\") pod \"glance-db-sync-md85d\" (UID: \"19002a58-e28b-4950-a48d-ba98820e57fe\") " pod="glance-kuttl-tests/glance-db-sync-md85d" Nov 26 23:00:29 crc kubenswrapper[5008]: I1126 23:00:29.670611 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttffq\" (UniqueName: \"kubernetes.io/projected/19002a58-e28b-4950-a48d-ba98820e57fe-kube-api-access-ttffq\") pod \"glance-db-sync-md85d\" (UID: \"19002a58-e28b-4950-a48d-ba98820e57fe\") " pod="glance-kuttl-tests/glance-db-sync-md85d" Nov 26 23:00:29 crc kubenswrapper[5008]: I1126 23:00:29.678389 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19002a58-e28b-4950-a48d-ba98820e57fe-config-data\") pod \"glance-db-sync-md85d\" (UID: \"19002a58-e28b-4950-a48d-ba98820e57fe\") " pod="glance-kuttl-tests/glance-db-sync-md85d" Nov 26 23:00:29 crc kubenswrapper[5008]: I1126 23:00:29.679238 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/19002a58-e28b-4950-a48d-ba98820e57fe-db-sync-config-data\") pod \"glance-db-sync-md85d\" (UID: \"19002a58-e28b-4950-a48d-ba98820e57fe\") " pod="glance-kuttl-tests/glance-db-sync-md85d" Nov 26 23:00:29 crc kubenswrapper[5008]: I1126 23:00:29.704249 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttffq\" (UniqueName: \"kubernetes.io/projected/19002a58-e28b-4950-a48d-ba98820e57fe-kube-api-access-ttffq\") pod 
\"glance-db-sync-md85d\" (UID: \"19002a58-e28b-4950-a48d-ba98820e57fe\") " pod="glance-kuttl-tests/glance-db-sync-md85d" Nov 26 23:00:29 crc kubenswrapper[5008]: I1126 23:00:29.833595 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-md85d" Nov 26 23:00:30 crc kubenswrapper[5008]: I1126 23:00:30.341181 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-md85d"] Nov 26 23:00:30 crc kubenswrapper[5008]: W1126 23:00:30.347078 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19002a58_e28b_4950_a48d_ba98820e57fe.slice/crio-a806afbf359c906eae0b2c1d0057cb5df9fee557505250d4d3a8243d23cb0f3b WatchSource:0}: Error finding container a806afbf359c906eae0b2c1d0057cb5df9fee557505250d4d3a8243d23cb0f3b: Status 404 returned error can't find the container with id a806afbf359c906eae0b2c1d0057cb5df9fee557505250d4d3a8243d23cb0f3b Nov 26 23:00:31 crc kubenswrapper[5008]: I1126 23:00:31.136653 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-md85d" event={"ID":"19002a58-e28b-4950-a48d-ba98820e57fe","Type":"ContainerStarted","Data":"b7de7a9c065a6a4a0a844bde9a810a95eb224dc85f2d39b8157eb8fdee3c7c0a"} Nov 26 23:00:31 crc kubenswrapper[5008]: I1126 23:00:31.136698 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-md85d" event={"ID":"19002a58-e28b-4950-a48d-ba98820e57fe","Type":"ContainerStarted","Data":"a806afbf359c906eae0b2c1d0057cb5df9fee557505250d4d3a8243d23cb0f3b"} Nov 26 23:00:31 crc kubenswrapper[5008]: I1126 23:00:31.159914 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-md85d" podStartSLOduration=2.159895823 podStartE2EDuration="2.159895823s" podCreationTimestamp="2025-11-26 23:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:00:31.155074051 +0000 UTC m=+1306.567768063" watchObservedRunningTime="2025-11-26 23:00:31.159895823 +0000 UTC m=+1306.572589835" Nov 26 23:00:34 crc kubenswrapper[5008]: I1126 23:00:34.164856 5008 generic.go:334] "Generic (PLEG): container finished" podID="19002a58-e28b-4950-a48d-ba98820e57fe" containerID="b7de7a9c065a6a4a0a844bde9a810a95eb224dc85f2d39b8157eb8fdee3c7c0a" exitCode=0 Nov 26 23:00:34 crc kubenswrapper[5008]: I1126 23:00:34.165015 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-md85d" event={"ID":"19002a58-e28b-4950-a48d-ba98820e57fe","Type":"ContainerDied","Data":"b7de7a9c065a6a4a0a844bde9a810a95eb224dc85f2d39b8157eb8fdee3c7c0a"} Nov 26 23:00:35 crc kubenswrapper[5008]: I1126 23:00:35.617496 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-md85d" Nov 26 23:00:35 crc kubenswrapper[5008]: I1126 23:00:35.795031 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19002a58-e28b-4950-a48d-ba98820e57fe-config-data\") pod \"19002a58-e28b-4950-a48d-ba98820e57fe\" (UID: \"19002a58-e28b-4950-a48d-ba98820e57fe\") " Nov 26 23:00:35 crc kubenswrapper[5008]: I1126 23:00:35.795077 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/19002a58-e28b-4950-a48d-ba98820e57fe-db-sync-config-data\") pod \"19002a58-e28b-4950-a48d-ba98820e57fe\" (UID: \"19002a58-e28b-4950-a48d-ba98820e57fe\") " Nov 26 23:00:35 crc kubenswrapper[5008]: I1126 23:00:35.795127 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttffq\" (UniqueName: \"kubernetes.io/projected/19002a58-e28b-4950-a48d-ba98820e57fe-kube-api-access-ttffq\") pod 
\"19002a58-e28b-4950-a48d-ba98820e57fe\" (UID: \"19002a58-e28b-4950-a48d-ba98820e57fe\") " Nov 26 23:00:35 crc kubenswrapper[5008]: I1126 23:00:35.806209 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19002a58-e28b-4950-a48d-ba98820e57fe-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "19002a58-e28b-4950-a48d-ba98820e57fe" (UID: "19002a58-e28b-4950-a48d-ba98820e57fe"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:35 crc kubenswrapper[5008]: I1126 23:00:35.806981 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19002a58-e28b-4950-a48d-ba98820e57fe-kube-api-access-ttffq" (OuterVolumeSpecName: "kube-api-access-ttffq") pod "19002a58-e28b-4950-a48d-ba98820e57fe" (UID: "19002a58-e28b-4950-a48d-ba98820e57fe"). InnerVolumeSpecName "kube-api-access-ttffq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:00:35 crc kubenswrapper[5008]: I1126 23:00:35.872107 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19002a58-e28b-4950-a48d-ba98820e57fe-config-data" (OuterVolumeSpecName: "config-data") pod "19002a58-e28b-4950-a48d-ba98820e57fe" (UID: "19002a58-e28b-4950-a48d-ba98820e57fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:35 crc kubenswrapper[5008]: I1126 23:00:35.896899 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19002a58-e28b-4950-a48d-ba98820e57fe-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:35 crc kubenswrapper[5008]: I1126 23:00:35.896953 5008 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/19002a58-e28b-4950-a48d-ba98820e57fe-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:35 crc kubenswrapper[5008]: I1126 23:00:35.897037 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttffq\" (UniqueName: \"kubernetes.io/projected/19002a58-e28b-4950-a48d-ba98820e57fe-kube-api-access-ttffq\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.190266 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-md85d" event={"ID":"19002a58-e28b-4950-a48d-ba98820e57fe","Type":"ContainerDied","Data":"a806afbf359c906eae0b2c1d0057cb5df9fee557505250d4d3a8243d23cb0f3b"} Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.190313 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a806afbf359c906eae0b2c1d0057cb5df9fee557505250d4d3a8243d23cb0f3b" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.190355 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-md85d" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.588892 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 23:00:36 crc kubenswrapper[5008]: E1126 23:00:36.589208 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19002a58-e28b-4950-a48d-ba98820e57fe" containerName="glance-db-sync" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.589224 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="19002a58-e28b-4950-a48d-ba98820e57fe" containerName="glance-db-sync" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.589395 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="19002a58-e28b-4950-a48d-ba98820e57fe" containerName="glance-db-sync" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.590221 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.593157 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-zg65s" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.593294 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.593364 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.610738 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.715425 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-run\") pod 
\"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.715468 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-dev\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.715546 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.715583 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-etc-nvme\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.715605 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.715626 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/755cbcfd-bdf9-4186-956e-2f6b6716ec86-httpd-run\") pod \"glance-default-single-0\" (UID: 
\"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.715644 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.715659 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/755cbcfd-bdf9-4186-956e-2f6b6716ec86-logs\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.715701 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/755cbcfd-bdf9-4186-956e-2f6b6716ec86-scripts\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.715723 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/755cbcfd-bdf9-4186-956e-2f6b6716ec86-config-data\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.715860 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: 
\"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.715939 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-lib-modules\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.716015 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt2rx\" (UniqueName: \"kubernetes.io/projected/755cbcfd-bdf9-4186-956e-2f6b6716ec86-kube-api-access-mt2rx\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.716045 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-sys\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.817344 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/755cbcfd-bdf9-4186-956e-2f6b6716ec86-scripts\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.817458 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/755cbcfd-bdf9-4186-956e-2f6b6716ec86-config-data\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") 
" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.817977 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.817492 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.818228 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-lib-modules\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.818253 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt2rx\" (UniqueName: \"kubernetes.io/projected/755cbcfd-bdf9-4186-956e-2f6b6716ec86-kube-api-access-mt2rx\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.818269 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-sys\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc 
kubenswrapper[5008]: I1126 23:00:36.818299 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-run\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.818344 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-dev\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.818360 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-lib-modules\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.818378 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.818413 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.818444 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-sys\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.818457 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-dev\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.818488 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-etc-nvme\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.818548 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-run\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.818625 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.818659 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-etc-nvme\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " 
pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.818667 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/755cbcfd-bdf9-4186-956e-2f6b6716ec86-httpd-run\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.818702 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.818726 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/755cbcfd-bdf9-4186-956e-2f6b6716ec86-logs\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.818768 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.818873 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc 
kubenswrapper[5008]: I1126 23:00:36.819060 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/755cbcfd-bdf9-4186-956e-2f6b6716ec86-httpd-run\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.819100 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/755cbcfd-bdf9-4186-956e-2f6b6716ec86-logs\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.823643 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/755cbcfd-bdf9-4186-956e-2f6b6716ec86-scripts\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.824037 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/755cbcfd-bdf9-4186-956e-2f6b6716ec86-config-data\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.836555 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt2rx\" (UniqueName: \"kubernetes.io/projected/755cbcfd-bdf9-4186-956e-2f6b6716ec86-kube-api-access-mt2rx\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.843304 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.845816 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:36 crc kubenswrapper[5008]: I1126 23:00:36.905478 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:37 crc kubenswrapper[5008]: I1126 23:00:37.344809 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 23:00:37 crc kubenswrapper[5008]: W1126 23:00:37.356372 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod755cbcfd_bdf9_4186_956e_2f6b6716ec86.slice/crio-569af2e9c484a26666dd27d5a16fc72a8d8b12964bc34c95d0da30935bbbf8f2 WatchSource:0}: Error finding container 569af2e9c484a26666dd27d5a16fc72a8d8b12964bc34c95d0da30935bbbf8f2: Status 404 returned error can't find the container with id 569af2e9c484a26666dd27d5a16fc72a8d8b12964bc34c95d0da30935bbbf8f2 Nov 26 23:00:37 crc kubenswrapper[5008]: I1126 23:00:37.450036 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.206921 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"755cbcfd-bdf9-4186-956e-2f6b6716ec86","Type":"ContainerStarted","Data":"51211b3182b45e3ea27db0006e46c2d25bbaabc8950ca3bbbe2c0fb84b7c6b13"} Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.207515 
5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"755cbcfd-bdf9-4186-956e-2f6b6716ec86","Type":"ContainerStarted","Data":"573ac90c48af4826034bbc6837e5a05699f31b8ebfc94809a94a93518f708987"} Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.207537 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"755cbcfd-bdf9-4186-956e-2f6b6716ec86","Type":"ContainerStarted","Data":"569af2e9c484a26666dd27d5a16fc72a8d8b12964bc34c95d0da30935bbbf8f2"} Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.207168 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="755cbcfd-bdf9-4186-956e-2f6b6716ec86" containerName="glance-httpd" containerID="cri-o://51211b3182b45e3ea27db0006e46c2d25bbaabc8950ca3bbbe2c0fb84b7c6b13" gracePeriod=30 Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.207110 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="755cbcfd-bdf9-4186-956e-2f6b6716ec86" containerName="glance-log" containerID="cri-o://573ac90c48af4826034bbc6837e5a05699f31b8ebfc94809a94a93518f708987" gracePeriod=30 Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.230735 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.230716432 podStartE2EDuration="2.230716432s" podCreationTimestamp="2025-11-26 23:00:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:00:38.227542022 +0000 UTC m=+1313.640236034" watchObservedRunningTime="2025-11-26 23:00:38.230716432 +0000 UTC m=+1313.643410444" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.742982 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:38 crc kubenswrapper[5008]: W1126 23:00:38.752356 5008 watcher.go:93] Error while processing event ("/sys/fs/cgroup/user.slice/user-0.slice/session-c48.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/user.slice/user-0.slice/session-c48.scope: no such file or directory Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.859633 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/755cbcfd-bdf9-4186-956e-2f6b6716ec86-httpd-run\") pod \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.859757 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-lib-modules\") pod \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.859840 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/755cbcfd-bdf9-4186-956e-2f6b6716ec86-scripts\") pod \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.859884 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-etc-nvme\") pod \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.859893 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-lib-modules" (OuterVolumeSpecName: 
"lib-modules") pod "755cbcfd-bdf9-4186-956e-2f6b6716ec86" (UID: "755cbcfd-bdf9-4186-956e-2f6b6716ec86"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.859916 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-run\") pod \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.859949 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt2rx\" (UniqueName: \"kubernetes.io/projected/755cbcfd-bdf9-4186-956e-2f6b6716ec86-kube-api-access-mt2rx\") pod \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.859955 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "755cbcfd-bdf9-4186-956e-2f6b6716ec86" (UID: "755cbcfd-bdf9-4186-956e-2f6b6716ec86"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.860015 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-run" (OuterVolumeSpecName: "run") pod "755cbcfd-bdf9-4186-956e-2f6b6716ec86" (UID: "755cbcfd-bdf9-4186-956e-2f6b6716ec86"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.860016 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-dev\") pod \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.860045 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/755cbcfd-bdf9-4186-956e-2f6b6716ec86-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "755cbcfd-bdf9-4186-956e-2f6b6716ec86" (UID: "755cbcfd-bdf9-4186-956e-2f6b6716ec86"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.860073 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-dev" (OuterVolumeSpecName: "dev") pod "755cbcfd-bdf9-4186-956e-2f6b6716ec86" (UID: "755cbcfd-bdf9-4186-956e-2f6b6716ec86"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.860088 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-sys\") pod \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.860126 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/755cbcfd-bdf9-4186-956e-2f6b6716ec86-config-data\") pod \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.860154 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.860193 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-etc-iscsi\") pod \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.860219 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-sys" (OuterVolumeSpecName: "sys") pod "755cbcfd-bdf9-4186-956e-2f6b6716ec86" (UID: "755cbcfd-bdf9-4186-956e-2f6b6716ec86"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.860286 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/755cbcfd-bdf9-4186-956e-2f6b6716ec86-logs\") pod \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.860329 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.860368 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-var-locks-brick\") pod \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\" (UID: \"755cbcfd-bdf9-4186-956e-2f6b6716ec86\") " Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.860839 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "755cbcfd-bdf9-4186-956e-2f6b6716ec86" (UID: "755cbcfd-bdf9-4186-956e-2f6b6716ec86"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.860908 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "755cbcfd-bdf9-4186-956e-2f6b6716ec86" (UID: "755cbcfd-bdf9-4186-956e-2f6b6716ec86"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.861127 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.861143 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.861151 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-run\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.861159 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-dev\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.861166 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-sys\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.861173 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.861181 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/755cbcfd-bdf9-4186-956e-2f6b6716ec86-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.861229 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/755cbcfd-bdf9-4186-956e-2f6b6716ec86-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.861266 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/755cbcfd-bdf9-4186-956e-2f6b6716ec86-logs" (OuterVolumeSpecName: "logs") pod "755cbcfd-bdf9-4186-956e-2f6b6716ec86" (UID: "755cbcfd-bdf9-4186-956e-2f6b6716ec86"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.865601 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "755cbcfd-bdf9-4186-956e-2f6b6716ec86" (UID: "755cbcfd-bdf9-4186-956e-2f6b6716ec86"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.867144 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/755cbcfd-bdf9-4186-956e-2f6b6716ec86-kube-api-access-mt2rx" (OuterVolumeSpecName: "kube-api-access-mt2rx") pod "755cbcfd-bdf9-4186-956e-2f6b6716ec86" (UID: "755cbcfd-bdf9-4186-956e-2f6b6716ec86"). InnerVolumeSpecName "kube-api-access-mt2rx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.867160 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance-cache") pod "755cbcfd-bdf9-4186-956e-2f6b6716ec86" (UID: "755cbcfd-bdf9-4186-956e-2f6b6716ec86"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.884045 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755cbcfd-bdf9-4186-956e-2f6b6716ec86-scripts" (OuterVolumeSpecName: "scripts") pod "755cbcfd-bdf9-4186-956e-2f6b6716ec86" (UID: "755cbcfd-bdf9-4186-956e-2f6b6716ec86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.918614 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755cbcfd-bdf9-4186-956e-2f6b6716ec86-config-data" (OuterVolumeSpecName: "config-data") pod "755cbcfd-bdf9-4186-956e-2f6b6716ec86" (UID: "755cbcfd-bdf9-4186-956e-2f6b6716ec86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.962708 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/755cbcfd-bdf9-4186-956e-2f6b6716ec86-logs\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.962796 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.962819 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/755cbcfd-bdf9-4186-956e-2f6b6716ec86-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.962840 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt2rx\" (UniqueName: \"kubernetes.io/projected/755cbcfd-bdf9-4186-956e-2f6b6716ec86-kube-api-access-mt2rx\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.962860 
5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/755cbcfd-bdf9-4186-956e-2f6b6716ec86-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.962889 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.985925 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Nov 26 23:00:38 crc kubenswrapper[5008]: I1126 23:00:38.993133 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.065140 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.065196 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.222793 5008 generic.go:334] "Generic (PLEG): container finished" podID="755cbcfd-bdf9-4186-956e-2f6b6716ec86" containerID="51211b3182b45e3ea27db0006e46c2d25bbaabc8950ca3bbbe2c0fb84b7c6b13" exitCode=143 Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.222846 5008 generic.go:334] "Generic (PLEG): container finished" podID="755cbcfd-bdf9-4186-956e-2f6b6716ec86" containerID="573ac90c48af4826034bbc6837e5a05699f31b8ebfc94809a94a93518f708987" exitCode=143 Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.222875 5008 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.222870 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"755cbcfd-bdf9-4186-956e-2f6b6716ec86","Type":"ContainerDied","Data":"51211b3182b45e3ea27db0006e46c2d25bbaabc8950ca3bbbe2c0fb84b7c6b13"} Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.223026 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"755cbcfd-bdf9-4186-956e-2f6b6716ec86","Type":"ContainerDied","Data":"573ac90c48af4826034bbc6837e5a05699f31b8ebfc94809a94a93518f708987"} Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.223106 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"755cbcfd-bdf9-4186-956e-2f6b6716ec86","Type":"ContainerDied","Data":"569af2e9c484a26666dd27d5a16fc72a8d8b12964bc34c95d0da30935bbbf8f2"} Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.223119 5008 scope.go:117] "RemoveContainer" containerID="51211b3182b45e3ea27db0006e46c2d25bbaabc8950ca3bbbe2c0fb84b7c6b13" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.257361 5008 scope.go:117] "RemoveContainer" containerID="573ac90c48af4826034bbc6837e5a05699f31b8ebfc94809a94a93518f708987" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.265693 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.282884 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.295824 5008 scope.go:117] "RemoveContainer" containerID="51211b3182b45e3ea27db0006e46c2d25bbaabc8950ca3bbbe2c0fb84b7c6b13" Nov 26 23:00:39 crc kubenswrapper[5008]: E1126 23:00:39.296673 5008 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51211b3182b45e3ea27db0006e46c2d25bbaabc8950ca3bbbe2c0fb84b7c6b13\": container with ID starting with 51211b3182b45e3ea27db0006e46c2d25bbaabc8950ca3bbbe2c0fb84b7c6b13 not found: ID does not exist" containerID="51211b3182b45e3ea27db0006e46c2d25bbaabc8950ca3bbbe2c0fb84b7c6b13" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.296725 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51211b3182b45e3ea27db0006e46c2d25bbaabc8950ca3bbbe2c0fb84b7c6b13"} err="failed to get container status \"51211b3182b45e3ea27db0006e46c2d25bbaabc8950ca3bbbe2c0fb84b7c6b13\": rpc error: code = NotFound desc = could not find container \"51211b3182b45e3ea27db0006e46c2d25bbaabc8950ca3bbbe2c0fb84b7c6b13\": container with ID starting with 51211b3182b45e3ea27db0006e46c2d25bbaabc8950ca3bbbe2c0fb84b7c6b13 not found: ID does not exist" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.296760 5008 scope.go:117] "RemoveContainer" containerID="573ac90c48af4826034bbc6837e5a05699f31b8ebfc94809a94a93518f708987" Nov 26 23:00:39 crc kubenswrapper[5008]: E1126 23:00:39.297259 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"573ac90c48af4826034bbc6837e5a05699f31b8ebfc94809a94a93518f708987\": container with ID starting with 573ac90c48af4826034bbc6837e5a05699f31b8ebfc94809a94a93518f708987 not found: ID does not exist" containerID="573ac90c48af4826034bbc6837e5a05699f31b8ebfc94809a94a93518f708987" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.297289 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"573ac90c48af4826034bbc6837e5a05699f31b8ebfc94809a94a93518f708987"} err="failed to get container status \"573ac90c48af4826034bbc6837e5a05699f31b8ebfc94809a94a93518f708987\": rpc error: code = NotFound 
desc = could not find container \"573ac90c48af4826034bbc6837e5a05699f31b8ebfc94809a94a93518f708987\": container with ID starting with 573ac90c48af4826034bbc6837e5a05699f31b8ebfc94809a94a93518f708987 not found: ID does not exist" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.297309 5008 scope.go:117] "RemoveContainer" containerID="51211b3182b45e3ea27db0006e46c2d25bbaabc8950ca3bbbe2c0fb84b7c6b13" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.297569 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51211b3182b45e3ea27db0006e46c2d25bbaabc8950ca3bbbe2c0fb84b7c6b13"} err="failed to get container status \"51211b3182b45e3ea27db0006e46c2d25bbaabc8950ca3bbbe2c0fb84b7c6b13\": rpc error: code = NotFound desc = could not find container \"51211b3182b45e3ea27db0006e46c2d25bbaabc8950ca3bbbe2c0fb84b7c6b13\": container with ID starting with 51211b3182b45e3ea27db0006e46c2d25bbaabc8950ca3bbbe2c0fb84b7c6b13 not found: ID does not exist" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.297594 5008 scope.go:117] "RemoveContainer" containerID="573ac90c48af4826034bbc6837e5a05699f31b8ebfc94809a94a93518f708987" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.298028 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"573ac90c48af4826034bbc6837e5a05699f31b8ebfc94809a94a93518f708987"} err="failed to get container status \"573ac90c48af4826034bbc6837e5a05699f31b8ebfc94809a94a93518f708987\": rpc error: code = NotFound desc = could not find container \"573ac90c48af4826034bbc6837e5a05699f31b8ebfc94809a94a93518f708987\": container with ID starting with 573ac90c48af4826034bbc6837e5a05699f31b8ebfc94809a94a93518f708987 not found: ID does not exist" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.298688 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 23:00:39 crc kubenswrapper[5008]: E1126 
23:00:39.331768 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="755cbcfd-bdf9-4186-956e-2f6b6716ec86" containerName="glance-log" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.332184 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="755cbcfd-bdf9-4186-956e-2f6b6716ec86" containerName="glance-log" Nov 26 23:00:39 crc kubenswrapper[5008]: E1126 23:00:39.332276 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="755cbcfd-bdf9-4186-956e-2f6b6716ec86" containerName="glance-httpd" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.332286 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="755cbcfd-bdf9-4186-956e-2f6b6716ec86" containerName="glance-httpd" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.332549 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="755cbcfd-bdf9-4186-956e-2f6b6716ec86" containerName="glance-log" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.332568 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="755cbcfd-bdf9-4186-956e-2f6b6716ec86" containerName="glance-httpd" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.333424 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.333523 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.335865 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.336065 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-zg65s" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.336526 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.471748 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-etc-nvme\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.471816 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.471849 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k97wt\" (UniqueName: \"kubernetes.io/projected/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-kube-api-access-k97wt\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.471930 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-lib-modules\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.472079 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.472123 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-dev\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.472182 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-logs\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.472230 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-scripts\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.472262 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.472314 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-run\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.472384 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-httpd-run\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.472404 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-sys\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.472428 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-config-data\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.472457 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.529564 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="755cbcfd-bdf9-4186-956e-2f6b6716ec86" path="/var/lib/kubelet/pods/755cbcfd-bdf9-4186-956e-2f6b6716ec86/volumes" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.573943 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-run\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.574275 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-httpd-run\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.574089 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-run\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.574428 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-sys\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.574542 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-config-data\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.574587 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.574665 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-etc-nvme\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.574691 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.574728 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k97wt\" (UniqueName: \"kubernetes.io/projected/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-kube-api-access-k97wt\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.574808 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-lib-modules\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.574870 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.574894 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-dev\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.574945 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-logs\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.574991 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.575009 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-lib-modules\") pod \"glance-default-single-0\" (UID: 
\"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.575040 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-dev\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.574806 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-etc-nvme\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.575004 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-scripts\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.575034 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-httpd-run\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.575041 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc 
kubenswrapper[5008]: I1126 23:00:39.575176 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.575241 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.575363 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.575520 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-sys\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.575604 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-logs\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.581616 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-scripts\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.584933 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-config-data\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.597909 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k97wt\" (UniqueName: \"kubernetes.io/projected/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-kube-api-access-k97wt\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.603047 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.612302 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.655600 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:39 crc kubenswrapper[5008]: I1126 23:00:39.930090 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 23:00:40 crc kubenswrapper[5008]: I1126 23:00:40.232938 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"248fba6a-3e3b-4d60-8124-c68f0c1be9f2","Type":"ContainerStarted","Data":"932f7eff37b874c7bf09b2a92298d9a6fce4aff48ee4524165789e8de44d4312"} Nov 26 23:00:40 crc kubenswrapper[5008]: I1126 23:00:40.233386 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"248fba6a-3e3b-4d60-8124-c68f0c1be9f2","Type":"ContainerStarted","Data":"e52dd40167f05ae499a461fe94e9848fc9aef38d27e82c9de44ec4098ee750c2"} Nov 26 23:00:41 crc kubenswrapper[5008]: I1126 23:00:41.250159 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"248fba6a-3e3b-4d60-8124-c68f0c1be9f2","Type":"ContainerStarted","Data":"c699c3c93e8e96522c4dc806bba6494c739afa3e78b889fd5aa1bb396cb3548a"} Nov 26 23:00:41 crc kubenswrapper[5008]: I1126 23:00:41.274655 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.274637818 podStartE2EDuration="2.274637818s" podCreationTimestamp="2025-11-26 23:00:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:00:41.270336274 +0000 UTC m=+1316.683030286" watchObservedRunningTime="2025-11-26 23:00:41.274637818 +0000 UTC m=+1316.687331820" Nov 26 23:00:49 crc kubenswrapper[5008]: I1126 23:00:49.656183 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:49 crc 
kubenswrapper[5008]: I1126 23:00:49.656546 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:49 crc kubenswrapper[5008]: I1126 23:00:49.688832 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:49 crc kubenswrapper[5008]: I1126 23:00:49.713959 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:50 crc kubenswrapper[5008]: I1126 23:00:50.358135 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:50 crc kubenswrapper[5008]: I1126 23:00:50.358194 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:52 crc kubenswrapper[5008]: I1126 23:00:52.321161 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:52 crc kubenswrapper[5008]: I1126 23:00:52.385534 5008 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 23:00:52 crc kubenswrapper[5008]: I1126 23:00:52.401642 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.737822 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.740088 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.752779 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.754309 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.761307 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.768903 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.856817 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f167c5e-15f2-4d90-8599-c0494c20471a-httpd-run\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857117 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-httpd-run\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857135 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-sys\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 
23:00:55.857190 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-config-data\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857221 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-logs\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857237 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857255 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-etc-nvme\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857274 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-dev\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857293 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857312 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857331 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857348 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857364 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857384 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857411 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-dev\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857428 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbht2\" (UniqueName: \"kubernetes.io/projected/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-kube-api-access-rbht2\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857444 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-lib-modules\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857462 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-sys\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857478 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-run\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857494 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f167c5e-15f2-4d90-8599-c0494c20471a-config-data\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857509 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-lib-modules\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857525 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-scripts\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857543 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-run\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857560 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f167c5e-15f2-4d90-8599-c0494c20471a-logs\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857577 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-etc-nvme\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857592 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f167c5e-15f2-4d90-8599-c0494c20471a-scripts\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857607 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.857628 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b58j6\" (UniqueName: \"kubernetes.io/projected/9f167c5e-15f2-4d90-8599-c0494c20471a-kube-api-access-b58j6\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.959738 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-config-data\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.959797 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-logs\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.959829 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.959856 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-etc-nvme\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.959882 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-dev\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.959909 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod 
\"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.959934 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960021 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960083 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960109 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960132 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " 
pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960171 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-dev\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960192 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbht2\" (UniqueName: \"kubernetes.io/projected/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-kube-api-access-rbht2\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960210 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-lib-modules\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960229 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-sys\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960253 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-run\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960275 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f167c5e-15f2-4d90-8599-c0494c20471a-config-data\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960297 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-lib-modules\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960313 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-scripts\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960334 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-run\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960355 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f167c5e-15f2-4d90-8599-c0494c20471a-logs\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960374 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-etc-nvme\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960391 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f167c5e-15f2-4d90-8599-c0494c20471a-scripts\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960406 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960436 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b58j6\" (UniqueName: \"kubernetes.io/projected/9f167c5e-15f2-4d90-8599-c0494c20471a-kube-api-access-b58j6\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960457 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f167c5e-15f2-4d90-8599-c0494c20471a-httpd-run\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960476 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-httpd-run\") pod 
\"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960494 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-sys\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960569 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-sys\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.960930 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-logs\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.961105 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-sys\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.961189 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-lib-modules\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 
23:00:55.961197 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-etc-nvme\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.961232 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-etc-nvme\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.961252 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-run\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.961283 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-dev\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.961280 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.961486 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9f167c5e-15f2-4d90-8599-c0494c20471a-logs\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.961691 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.961802 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.961826 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.961884 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.961904 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-etc-iscsi\") 
pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.961928 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.962016 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.962225 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-dev\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.962451 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f167c5e-15f2-4d90-8599-c0494c20471a-httpd-run\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.962485 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-run\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " 
pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.962433 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-lib-modules\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.962922 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-httpd-run\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.966371 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-scripts\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.969366 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f167c5e-15f2-4d90-8599-c0494c20471a-scripts\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.975641 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-config-data\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.976775 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-b58j6\" (UniqueName: \"kubernetes.io/projected/9f167c5e-15f2-4d90-8599-c0494c20471a-kube-api-access-b58j6\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.977664 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f167c5e-15f2-4d90-8599-c0494c20471a-config-data\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.981570 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbht2\" (UniqueName: \"kubernetes.io/projected/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-kube-api-access-rbht2\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.989488 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:55 crc kubenswrapper[5008]: I1126 23:00:55.993030 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:56 crc kubenswrapper[5008]: I1126 23:00:56.005270 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") 
pod \"glance-default-single-1\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:56 crc kubenswrapper[5008]: I1126 23:00:56.053390 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-2\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") " pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:56 crc kubenswrapper[5008]: I1126 23:00:56.104605 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:00:56 crc kubenswrapper[5008]: I1126 23:00:56.114888 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:00:56 crc kubenswrapper[5008]: I1126 23:00:56.535762 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Nov 26 23:00:56 crc kubenswrapper[5008]: W1126 23:00:56.544468 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f167c5e_15f2_4d90_8599_c0494c20471a.slice/crio-cc2b002ca67ae0f8bf69bc376a373e3acfa92e20065bbcda4e17211b112b6099 WatchSource:0}: Error finding container cc2b002ca67ae0f8bf69bc376a373e3acfa92e20065bbcda4e17211b112b6099: Status 404 returned error can't find the container with id cc2b002ca67ae0f8bf69bc376a373e3acfa92e20065bbcda4e17211b112b6099 Nov 26 23:00:56 crc kubenswrapper[5008]: I1126 23:00:56.610333 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 26 23:00:56 crc kubenswrapper[5008]: W1126 23:00:56.617732 5008 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc63b418d_bbe9_4f45_8a71_a8924dda5b1f.slice/crio-56e2da2636e3af229252318082f8a1d66d36b57dee54819736d458b0aec1b27d WatchSource:0}: Error finding container 56e2da2636e3af229252318082f8a1d66d36b57dee54819736d458b0aec1b27d: Status 404 returned error can't find the container with id 56e2da2636e3af229252318082f8a1d66d36b57dee54819736d458b0aec1b27d Nov 26 23:00:57 crc kubenswrapper[5008]: I1126 23:00:57.444367 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"9f167c5e-15f2-4d90-8599-c0494c20471a","Type":"ContainerStarted","Data":"d5fdd1562aac9270d23e1ad59d02b48260779b26a929899eef7a4920ccc99fed"} Nov 26 23:00:57 crc kubenswrapper[5008]: I1126 23:00:57.445016 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"9f167c5e-15f2-4d90-8599-c0494c20471a","Type":"ContainerStarted","Data":"260e14d1e4056fca335fa58cf3ac53afbfb6771eaff72a569223a7ed9c325420"} Nov 26 23:00:57 crc kubenswrapper[5008]: I1126 23:00:57.445032 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"9f167c5e-15f2-4d90-8599-c0494c20471a","Type":"ContainerStarted","Data":"cc2b002ca67ae0f8bf69bc376a373e3acfa92e20065bbcda4e17211b112b6099"} Nov 26 23:00:57 crc kubenswrapper[5008]: I1126 23:00:57.446387 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"c63b418d-bbe9-4f45-8a71-a8924dda5b1f","Type":"ContainerStarted","Data":"4c2780125bbbed792ca094404e74691edf41ffb8d4f24ad0632929b149bcbdc6"} Nov 26 23:00:57 crc kubenswrapper[5008]: I1126 23:00:57.446520 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" 
event={"ID":"c63b418d-bbe9-4f45-8a71-a8924dda5b1f","Type":"ContainerStarted","Data":"0913464a6fd903ee60baf08db5adcb32882e22885be5a221edea228ab790dac9"} Nov 26 23:00:57 crc kubenswrapper[5008]: I1126 23:00:57.446534 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"c63b418d-bbe9-4f45-8a71-a8924dda5b1f","Type":"ContainerStarted","Data":"56e2da2636e3af229252318082f8a1d66d36b57dee54819736d458b0aec1b27d"} Nov 26 23:00:57 crc kubenswrapper[5008]: I1126 23:00:57.463333 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-2" podStartSLOduration=3.463318131 podStartE2EDuration="3.463318131s" podCreationTimestamp="2025-11-26 23:00:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:00:57.461449221 +0000 UTC m=+1332.874143223" watchObservedRunningTime="2025-11-26 23:00:57.463318131 +0000 UTC m=+1332.876012133" Nov 26 23:00:57 crc kubenswrapper[5008]: I1126 23:00:57.487579 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=3.487562462 podStartE2EDuration="3.487562462s" podCreationTimestamp="2025-11-26 23:00:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:00:57.487025225 +0000 UTC m=+1332.899719227" watchObservedRunningTime="2025-11-26 23:00:57.487562462 +0000 UTC m=+1332.900256474" Nov 26 23:00:59 crc kubenswrapper[5008]: I1126 23:00:59.281048 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 23:00:59 crc 
kubenswrapper[5008]: I1126 23:00:59.281485 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 23:00:59 crc kubenswrapper[5008]: I1126 23:00:59.281549 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" Nov 26 23:00:59 crc kubenswrapper[5008]: I1126 23:00:59.282547 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9fefa5cb673a1b6294b5289afba9c13ed72e7092fa22ba3e5f1ef5b55c16e305"} pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 23:00:59 crc kubenswrapper[5008]: I1126 23:00:59.282673 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" containerID="cri-o://9fefa5cb673a1b6294b5289afba9c13ed72e7092fa22ba3e5f1ef5b55c16e305" gracePeriod=600 Nov 26 23:00:59 crc kubenswrapper[5008]: I1126 23:00:59.474165 5008 generic.go:334] "Generic (PLEG): container finished" podID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerID="9fefa5cb673a1b6294b5289afba9c13ed72e7092fa22ba3e5f1ef5b55c16e305" exitCode=0 Nov 26 23:00:59 crc kubenswrapper[5008]: I1126 23:00:59.474929 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" event={"ID":"8e558d58-c5ad-41f5-930f-36ac26b1a1ea","Type":"ContainerDied","Data":"9fefa5cb673a1b6294b5289afba9c13ed72e7092fa22ba3e5f1ef5b55c16e305"} 
Nov 26 23:00:59 crc kubenswrapper[5008]: I1126 23:00:59.475251 5008 scope.go:117] "RemoveContainer" containerID="9fe41703f50fb88cb25a2da55af0500e1caf0de96a6d4c9a6682af82c31219e6" Nov 26 23:01:00 crc kubenswrapper[5008]: I1126 23:01:00.133035 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-cron-29403301-7ndxk"] Nov 26 23:01:00 crc kubenswrapper[5008]: I1126 23:01:00.134121 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-cron-29403301-7ndxk" Nov 26 23:01:00 crc kubenswrapper[5008]: I1126 23:01:00.144921 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-cron-29403301-7ndxk"] Nov 26 23:01:00 crc kubenswrapper[5008]: I1126 23:01:00.248610 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28476b64-892d-42b6-8b62-16735b92b9d9-fernet-keys\") pod \"keystone-cron-29403301-7ndxk\" (UID: \"28476b64-892d-42b6-8b62-16735b92b9d9\") " pod="glance-kuttl-tests/keystone-cron-29403301-7ndxk" Nov 26 23:01:00 crc kubenswrapper[5008]: I1126 23:01:00.248707 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28476b64-892d-42b6-8b62-16735b92b9d9-config-data\") pod \"keystone-cron-29403301-7ndxk\" (UID: \"28476b64-892d-42b6-8b62-16735b92b9d9\") " pod="glance-kuttl-tests/keystone-cron-29403301-7ndxk" Nov 26 23:01:00 crc kubenswrapper[5008]: I1126 23:01:00.248820 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq4fq\" (UniqueName: \"kubernetes.io/projected/28476b64-892d-42b6-8b62-16735b92b9d9-kube-api-access-qq4fq\") pod \"keystone-cron-29403301-7ndxk\" (UID: \"28476b64-892d-42b6-8b62-16735b92b9d9\") " pod="glance-kuttl-tests/keystone-cron-29403301-7ndxk" Nov 26 23:01:00 crc 
kubenswrapper[5008]: I1126 23:01:00.350200 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq4fq\" (UniqueName: \"kubernetes.io/projected/28476b64-892d-42b6-8b62-16735b92b9d9-kube-api-access-qq4fq\") pod \"keystone-cron-29403301-7ndxk\" (UID: \"28476b64-892d-42b6-8b62-16735b92b9d9\") " pod="glance-kuttl-tests/keystone-cron-29403301-7ndxk" Nov 26 23:01:00 crc kubenswrapper[5008]: I1126 23:01:00.350799 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28476b64-892d-42b6-8b62-16735b92b9d9-fernet-keys\") pod \"keystone-cron-29403301-7ndxk\" (UID: \"28476b64-892d-42b6-8b62-16735b92b9d9\") " pod="glance-kuttl-tests/keystone-cron-29403301-7ndxk" Nov 26 23:01:00 crc kubenswrapper[5008]: I1126 23:01:00.350959 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28476b64-892d-42b6-8b62-16735b92b9d9-config-data\") pod \"keystone-cron-29403301-7ndxk\" (UID: \"28476b64-892d-42b6-8b62-16735b92b9d9\") " pod="glance-kuttl-tests/keystone-cron-29403301-7ndxk" Nov 26 23:01:00 crc kubenswrapper[5008]: I1126 23:01:00.360243 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28476b64-892d-42b6-8b62-16735b92b9d9-fernet-keys\") pod \"keystone-cron-29403301-7ndxk\" (UID: \"28476b64-892d-42b6-8b62-16735b92b9d9\") " pod="glance-kuttl-tests/keystone-cron-29403301-7ndxk" Nov 26 23:01:00 crc kubenswrapper[5008]: I1126 23:01:00.361439 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28476b64-892d-42b6-8b62-16735b92b9d9-config-data\") pod \"keystone-cron-29403301-7ndxk\" (UID: \"28476b64-892d-42b6-8b62-16735b92b9d9\") " pod="glance-kuttl-tests/keystone-cron-29403301-7ndxk" Nov 26 23:01:00 crc kubenswrapper[5008]: I1126 23:01:00.387367 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq4fq\" (UniqueName: \"kubernetes.io/projected/28476b64-892d-42b6-8b62-16735b92b9d9-kube-api-access-qq4fq\") pod \"keystone-cron-29403301-7ndxk\" (UID: \"28476b64-892d-42b6-8b62-16735b92b9d9\") " pod="glance-kuttl-tests/keystone-cron-29403301-7ndxk" Nov 26 23:01:00 crc kubenswrapper[5008]: I1126 23:01:00.453181 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-cron-29403301-7ndxk" Nov 26 23:01:00 crc kubenswrapper[5008]: I1126 23:01:00.503514 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" event={"ID":"8e558d58-c5ad-41f5-930f-36ac26b1a1ea","Type":"ContainerStarted","Data":"6d543501fd1766cb3ccb22d2ef07d77dd6f33442235b1e1d91f887fc3e2188b8"} Nov 26 23:01:00 crc kubenswrapper[5008]: I1126 23:01:00.919825 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-cron-29403301-7ndxk"] Nov 26 23:01:01 crc kubenswrapper[5008]: I1126 23:01:01.515631 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-cron-29403301-7ndxk" event={"ID":"28476b64-892d-42b6-8b62-16735b92b9d9","Type":"ContainerStarted","Data":"56a84526b6d913e2ab753c549215566919fcada869531b7f3489c26c6f5c7cac"} Nov 26 23:01:01 crc kubenswrapper[5008]: I1126 23:01:01.516216 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-cron-29403301-7ndxk" event={"ID":"28476b64-892d-42b6-8b62-16735b92b9d9","Type":"ContainerStarted","Data":"c157ff4fb1b2eba5d0470bb70916287a8003632f329fdd909140642c6badc752"} Nov 26 23:01:01 crc kubenswrapper[5008]: I1126 23:01:01.538465 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-cron-29403301-7ndxk" podStartSLOduration=1.538440269 podStartE2EDuration="1.538440269s" podCreationTimestamp="2025-11-26 23:01:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:01:01.536483498 +0000 UTC m=+1336.949177510" watchObservedRunningTime="2025-11-26 23:01:01.538440269 +0000 UTC m=+1336.951134311" Nov 26 23:01:03 crc kubenswrapper[5008]: I1126 23:01:03.541465 5008 generic.go:334] "Generic (PLEG): container finished" podID="28476b64-892d-42b6-8b62-16735b92b9d9" containerID="56a84526b6d913e2ab753c549215566919fcada869531b7f3489c26c6f5c7cac" exitCode=0 Nov 26 23:01:03 crc kubenswrapper[5008]: I1126 23:01:03.542135 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-cron-29403301-7ndxk" event={"ID":"28476b64-892d-42b6-8b62-16735b92b9d9","Type":"ContainerDied","Data":"56a84526b6d913e2ab753c549215566919fcada869531b7f3489c26c6f5c7cac"} Nov 26 23:01:04 crc kubenswrapper[5008]: I1126 23:01:04.865008 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-cron-29403301-7ndxk" Nov 26 23:01:04 crc kubenswrapper[5008]: I1126 23:01:04.941459 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq4fq\" (UniqueName: \"kubernetes.io/projected/28476b64-892d-42b6-8b62-16735b92b9d9-kube-api-access-qq4fq\") pod \"28476b64-892d-42b6-8b62-16735b92b9d9\" (UID: \"28476b64-892d-42b6-8b62-16735b92b9d9\") " Nov 26 23:01:04 crc kubenswrapper[5008]: I1126 23:01:04.941673 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28476b64-892d-42b6-8b62-16735b92b9d9-fernet-keys\") pod \"28476b64-892d-42b6-8b62-16735b92b9d9\" (UID: \"28476b64-892d-42b6-8b62-16735b92b9d9\") " Nov 26 23:01:04 crc kubenswrapper[5008]: I1126 23:01:04.941707 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/28476b64-892d-42b6-8b62-16735b92b9d9-config-data\") pod \"28476b64-892d-42b6-8b62-16735b92b9d9\" (UID: \"28476b64-892d-42b6-8b62-16735b92b9d9\") " Nov 26 23:01:04 crc kubenswrapper[5008]: I1126 23:01:04.947284 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28476b64-892d-42b6-8b62-16735b92b9d9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "28476b64-892d-42b6-8b62-16735b92b9d9" (UID: "28476b64-892d-42b6-8b62-16735b92b9d9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:01:04 crc kubenswrapper[5008]: I1126 23:01:04.947947 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28476b64-892d-42b6-8b62-16735b92b9d9-kube-api-access-qq4fq" (OuterVolumeSpecName: "kube-api-access-qq4fq") pod "28476b64-892d-42b6-8b62-16735b92b9d9" (UID: "28476b64-892d-42b6-8b62-16735b92b9d9"). InnerVolumeSpecName "kube-api-access-qq4fq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:01:05 crc kubenswrapper[5008]: I1126 23:01:05.006825 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28476b64-892d-42b6-8b62-16735b92b9d9-config-data" (OuterVolumeSpecName: "config-data") pod "28476b64-892d-42b6-8b62-16735b92b9d9" (UID: "28476b64-892d-42b6-8b62-16735b92b9d9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:01:05 crc kubenswrapper[5008]: I1126 23:01:05.044229 5008 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28476b64-892d-42b6-8b62-16735b92b9d9-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:05 crc kubenswrapper[5008]: I1126 23:01:05.044260 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28476b64-892d-42b6-8b62-16735b92b9d9-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:05 crc kubenswrapper[5008]: I1126 23:01:05.044269 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq4fq\" (UniqueName: \"kubernetes.io/projected/28476b64-892d-42b6-8b62-16735b92b9d9-kube-api-access-qq4fq\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:05 crc kubenswrapper[5008]: I1126 23:01:05.563284 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-cron-29403301-7ndxk" event={"ID":"28476b64-892d-42b6-8b62-16735b92b9d9","Type":"ContainerDied","Data":"c157ff4fb1b2eba5d0470bb70916287a8003632f329fdd909140642c6badc752"} Nov 26 23:01:05 crc kubenswrapper[5008]: I1126 23:01:05.563598 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c157ff4fb1b2eba5d0470bb70916287a8003632f329fdd909140642c6badc752" Nov 26 23:01:05 crc kubenswrapper[5008]: I1126 23:01:05.563430 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-cron-29403301-7ndxk" Nov 26 23:01:06 crc kubenswrapper[5008]: I1126 23:01:06.104830 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:01:06 crc kubenswrapper[5008]: I1126 23:01:06.105434 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:01:06 crc kubenswrapper[5008]: I1126 23:01:06.115406 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:01:06 crc kubenswrapper[5008]: I1126 23:01:06.115451 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:01:06 crc kubenswrapper[5008]: I1126 23:01:06.147003 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:01:06 crc kubenswrapper[5008]: I1126 23:01:06.165174 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:01:06 crc kubenswrapper[5008]: I1126 23:01:06.171492 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:01:06 crc kubenswrapper[5008]: I1126 23:01:06.188327 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:01:06 crc kubenswrapper[5008]: I1126 23:01:06.576745 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:01:06 crc kubenswrapper[5008]: I1126 23:01:06.576802 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:01:06 crc kubenswrapper[5008]: I1126 23:01:06.576824 
5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:01:06 crc kubenswrapper[5008]: I1126 23:01:06.576842 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:01:08 crc kubenswrapper[5008]: I1126 23:01:08.530763 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:01:08 crc kubenswrapper[5008]: I1126 23:01:08.565336 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:01:08 crc kubenswrapper[5008]: I1126 23:01:08.809743 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:01:08 crc kubenswrapper[5008]: I1126 23:01:08.810185 5008 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 23:01:08 crc kubenswrapper[5008]: I1126 23:01:08.812858 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:01:10 crc kubenswrapper[5008]: I1126 23:01:10.384794 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Nov 26 23:01:10 crc kubenswrapper[5008]: I1126 23:01:10.393757 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 26 23:01:10 crc kubenswrapper[5008]: I1126 23:01:10.610109 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="c63b418d-bbe9-4f45-8a71-a8924dda5b1f" containerName="glance-log" containerID="cri-o://0913464a6fd903ee60baf08db5adcb32882e22885be5a221edea228ab790dac9" gracePeriod=30 Nov 26 23:01:10 crc kubenswrapper[5008]: I1126 23:01:10.610181 5008 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="c63b418d-bbe9-4f45-8a71-a8924dda5b1f" containerName="glance-httpd" containerID="cri-o://4c2780125bbbed792ca094404e74691edf41ffb8d4f24ad0632929b149bcbdc6" gracePeriod=30 Nov 26 23:01:10 crc kubenswrapper[5008]: I1126 23:01:10.610246 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-2" podUID="9f167c5e-15f2-4d90-8599-c0494c20471a" containerName="glance-log" containerID="cri-o://260e14d1e4056fca335fa58cf3ac53afbfb6771eaff72a569223a7ed9c325420" gracePeriod=30 Nov 26 23:01:10 crc kubenswrapper[5008]: I1126 23:01:10.610304 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-2" podUID="9f167c5e-15f2-4d90-8599-c0494c20471a" containerName="glance-httpd" containerID="cri-o://d5fdd1562aac9270d23e1ad59d02b48260779b26a929899eef7a4920ccc99fed" gracePeriod=30 Nov 26 23:01:11 crc kubenswrapper[5008]: I1126 23:01:11.622660 5008 generic.go:334] "Generic (PLEG): container finished" podID="9f167c5e-15f2-4d90-8599-c0494c20471a" containerID="260e14d1e4056fca335fa58cf3ac53afbfb6771eaff72a569223a7ed9c325420" exitCode=143 Nov 26 23:01:11 crc kubenswrapper[5008]: I1126 23:01:11.622857 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"9f167c5e-15f2-4d90-8599-c0494c20471a","Type":"ContainerDied","Data":"260e14d1e4056fca335fa58cf3ac53afbfb6771eaff72a569223a7ed9c325420"} Nov 26 23:01:11 crc kubenswrapper[5008]: I1126 23:01:11.626925 5008 generic.go:334] "Generic (PLEG): container finished" podID="c63b418d-bbe9-4f45-8a71-a8924dda5b1f" containerID="0913464a6fd903ee60baf08db5adcb32882e22885be5a221edea228ab790dac9" exitCode=143 Nov 26 23:01:11 crc kubenswrapper[5008]: I1126 23:01:11.626986 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" 
event={"ID":"c63b418d-bbe9-4f45-8a71-a8924dda5b1f","Type":"ContainerDied","Data":"0913464a6fd903ee60baf08db5adcb32882e22885be5a221edea228ab790dac9"} Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.229106 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.285639 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.313021 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbht2\" (UniqueName: \"kubernetes.io/projected/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-kube-api-access-rbht2\") pod \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.313086 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-lib-modules\") pod \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.313115 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-etc-nvme\") pod \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.313136 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-config-data\") pod \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 
23:01:14.313179 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-logs\") pod \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.313199 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-httpd-run\") pod \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.313219 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.313203 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "c63b418d-bbe9-4f45-8a71-a8924dda5b1f" (UID: "c63b418d-bbe9-4f45-8a71-a8924dda5b1f"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.313263 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-sys\") pod \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.313357 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-sys" (OuterVolumeSpecName: "sys") pod "c63b418d-bbe9-4f45-8a71-a8924dda5b1f" (UID: "c63b418d-bbe9-4f45-8a71-a8924dda5b1f"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.313689 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-logs" (OuterVolumeSpecName: "logs") pod "c63b418d-bbe9-4f45-8a71-a8924dda5b1f" (UID: "c63b418d-bbe9-4f45-8a71-a8924dda5b1f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.313816 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c63b418d-bbe9-4f45-8a71-a8924dda5b1f" (UID: "c63b418d-bbe9-4f45-8a71-a8924dda5b1f"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.313822 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-scripts\") pod \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.313878 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-etc-iscsi\") pod \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.313917 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-run\") pod \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.313948 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.313982 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-dev\") pod \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.314005 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-var-locks-brick\") pod 
\"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\" (UID: \"c63b418d-bbe9-4f45-8a71-a8924dda5b1f\") " Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.313831 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "c63b418d-bbe9-4f45-8a71-a8924dda5b1f" (UID: "c63b418d-bbe9-4f45-8a71-a8924dda5b1f"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.314102 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-run" (OuterVolumeSpecName: "run") pod "c63b418d-bbe9-4f45-8a71-a8924dda5b1f" (UID: "c63b418d-bbe9-4f45-8a71-a8924dda5b1f"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.314142 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "c63b418d-bbe9-4f45-8a71-a8924dda5b1f" (UID: "c63b418d-bbe9-4f45-8a71-a8924dda5b1f"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.314220 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-dev" (OuterVolumeSpecName: "dev") pod "c63b418d-bbe9-4f45-8a71-a8924dda5b1f" (UID: "c63b418d-bbe9-4f45-8a71-a8924dda5b1f"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.314256 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "c63b418d-bbe9-4f45-8a71-a8924dda5b1f" (UID: "c63b418d-bbe9-4f45-8a71-a8924dda5b1f"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.314594 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-etc-iscsi\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.314616 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-run\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.314627 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-dev\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.314638 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-var-locks-brick\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.314649 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-lib-modules\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.314656 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-etc-nvme\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.314664 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-logs\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.314671 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-httpd-run\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.314679 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-sys\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.318388 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance-cache") pod "c63b418d-bbe9-4f45-8a71-a8924dda5b1f" (UID: "c63b418d-bbe9-4f45-8a71-a8924dda5b1f"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.318423 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-kube-api-access-rbht2" (OuterVolumeSpecName: "kube-api-access-rbht2") pod "c63b418d-bbe9-4f45-8a71-a8924dda5b1f" (UID: "c63b418d-bbe9-4f45-8a71-a8924dda5b1f"). InnerVolumeSpecName "kube-api-access-rbht2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.318529 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "c63b418d-bbe9-4f45-8a71-a8924dda5b1f" (UID: "c63b418d-bbe9-4f45-8a71-a8924dda5b1f"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.320395 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-scripts" (OuterVolumeSpecName: "scripts") pod "c63b418d-bbe9-4f45-8a71-a8924dda5b1f" (UID: "c63b418d-bbe9-4f45-8a71-a8924dda5b1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.359502 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-config-data" (OuterVolumeSpecName: "config-data") pod "c63b418d-bbe9-4f45-8a71-a8924dda5b1f" (UID: "c63b418d-bbe9-4f45-8a71-a8924dda5b1f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.415636 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f167c5e-15f2-4d90-8599-c0494c20471a-config-data\") pod \"9f167c5e-15f2-4d90-8599-c0494c20471a\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") "
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.415692 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"9f167c5e-15f2-4d90-8599-c0494c20471a\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") "
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.415736 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"9f167c5e-15f2-4d90-8599-c0494c20471a\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") "
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.415762 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f167c5e-15f2-4d90-8599-c0494c20471a-scripts\") pod \"9f167c5e-15f2-4d90-8599-c0494c20471a\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") "
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.415790 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-etc-nvme\") pod \"9f167c5e-15f2-4d90-8599-c0494c20471a\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") "
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.415836 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-run\") pod \"9f167c5e-15f2-4d90-8599-c0494c20471a\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") "
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.415863 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f167c5e-15f2-4d90-8599-c0494c20471a-logs\") pod \"9f167c5e-15f2-4d90-8599-c0494c20471a\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") "
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.415922 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-etc-iscsi\") pod \"9f167c5e-15f2-4d90-8599-c0494c20471a\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") "
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.415953 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b58j6\" (UniqueName: \"kubernetes.io/projected/9f167c5e-15f2-4d90-8599-c0494c20471a-kube-api-access-b58j6\") pod \"9f167c5e-15f2-4d90-8599-c0494c20471a\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") "
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.416033 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-var-locks-brick\") pod \"9f167c5e-15f2-4d90-8599-c0494c20471a\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") "
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.416064 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f167c5e-15f2-4d90-8599-c0494c20471a-httpd-run\") pod \"9f167c5e-15f2-4d90-8599-c0494c20471a\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") "
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.416088 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-dev\") pod \"9f167c5e-15f2-4d90-8599-c0494c20471a\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") "
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.416084 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "9f167c5e-15f2-4d90-8599-c0494c20471a" (UID: "9f167c5e-15f2-4d90-8599-c0494c20471a"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.416154 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-sys\") pod \"9f167c5e-15f2-4d90-8599-c0494c20471a\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") "
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.416179 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-lib-modules\") pod \"9f167c5e-15f2-4d90-8599-c0494c20471a\" (UID: \"9f167c5e-15f2-4d90-8599-c0494c20471a\") "
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.416515 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-etc-nvme\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.416538 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-scripts\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.416561 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" "
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.416574 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbht2\" (UniqueName: \"kubernetes.io/projected/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-kube-api-access-rbht2\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.416587 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c63b418d-bbe9-4f45-8a71-a8924dda5b1f-config-data\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.416603 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.417044 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f167c5e-15f2-4d90-8599-c0494c20471a-logs" (OuterVolumeSpecName: "logs") pod "9f167c5e-15f2-4d90-8599-c0494c20471a" (UID: "9f167c5e-15f2-4d90-8599-c0494c20471a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.417136 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-run" (OuterVolumeSpecName: "run") pod "9f167c5e-15f2-4d90-8599-c0494c20471a" (UID: "9f167c5e-15f2-4d90-8599-c0494c20471a"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.417184 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "9f167c5e-15f2-4d90-8599-c0494c20471a" (UID: "9f167c5e-15f2-4d90-8599-c0494c20471a"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.417224 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-dev" (OuterVolumeSpecName: "dev") pod "9f167c5e-15f2-4d90-8599-c0494c20471a" (UID: "9f167c5e-15f2-4d90-8599-c0494c20471a"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.417275 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "9f167c5e-15f2-4d90-8599-c0494c20471a" (UID: "9f167c5e-15f2-4d90-8599-c0494c20471a"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.417729 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f167c5e-15f2-4d90-8599-c0494c20471a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9f167c5e-15f2-4d90-8599-c0494c20471a" (UID: "9f167c5e-15f2-4d90-8599-c0494c20471a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.417804 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "9f167c5e-15f2-4d90-8599-c0494c20471a" (UID: "9f167c5e-15f2-4d90-8599-c0494c20471a"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.417996 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-sys" (OuterVolumeSpecName: "sys") pod "9f167c5e-15f2-4d90-8599-c0494c20471a" (UID: "9f167c5e-15f2-4d90-8599-c0494c20471a"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.418422 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "9f167c5e-15f2-4d90-8599-c0494c20471a" (UID: "9f167c5e-15f2-4d90-8599-c0494c20471a"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.419766 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance-cache") pod "9f167c5e-15f2-4d90-8599-c0494c20471a" (UID: "9f167c5e-15f2-4d90-8599-c0494c20471a"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.420393 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f167c5e-15f2-4d90-8599-c0494c20471a-kube-api-access-b58j6" (OuterVolumeSpecName: "kube-api-access-b58j6") pod "9f167c5e-15f2-4d90-8599-c0494c20471a" (UID: "9f167c5e-15f2-4d90-8599-c0494c20471a"). InnerVolumeSpecName "kube-api-access-b58j6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.420399 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f167c5e-15f2-4d90-8599-c0494c20471a-scripts" (OuterVolumeSpecName: "scripts") pod "9f167c5e-15f2-4d90-8599-c0494c20471a" (UID: "9f167c5e-15f2-4d90-8599-c0494c20471a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.455305 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc"
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.458000 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.471372 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f167c5e-15f2-4d90-8599-c0494c20471a-config-data" (OuterVolumeSpecName: "config-data") pod "9f167c5e-15f2-4d90-8599-c0494c20471a" (UID: "9f167c5e-15f2-4d90-8599-c0494c20471a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.518169 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f167c5e-15f2-4d90-8599-c0494c20471a-config-data\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.518460 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.518593 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.518699 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.518780 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f167c5e-15f2-4d90-8599-c0494c20471a-scripts\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.518864 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-run\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.518941 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f167c5e-15f2-4d90-8599-c0494c20471a-logs\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.519047 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-etc-iscsi\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.519131 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.519208 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b58j6\" (UniqueName: \"kubernetes.io/projected/9f167c5e-15f2-4d90-8599-c0494c20471a-kube-api-access-b58j6\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.519286 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-var-locks-brick\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.519370 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f167c5e-15f2-4d90-8599-c0494c20471a-httpd-run\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.519452 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-dev\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.519532 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-sys\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.519609 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f167c5e-15f2-4d90-8599-c0494c20471a-lib-modules\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.534692 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.535759 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.620860 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.620893 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.660784 5008 generic.go:334] "Generic (PLEG): container finished" podID="9f167c5e-15f2-4d90-8599-c0494c20471a" containerID="d5fdd1562aac9270d23e1ad59d02b48260779b26a929899eef7a4920ccc99fed" exitCode=0
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.660869 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2"
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.660889 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"9f167c5e-15f2-4d90-8599-c0494c20471a","Type":"ContainerDied","Data":"d5fdd1562aac9270d23e1ad59d02b48260779b26a929899eef7a4920ccc99fed"}
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.661522 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"9f167c5e-15f2-4d90-8599-c0494c20471a","Type":"ContainerDied","Data":"cc2b002ca67ae0f8bf69bc376a373e3acfa92e20065bbcda4e17211b112b6099"}
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.661555 5008 scope.go:117] "RemoveContainer" containerID="d5fdd1562aac9270d23e1ad59d02b48260779b26a929899eef7a4920ccc99fed"
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.663566 5008 generic.go:334] "Generic (PLEG): container finished" podID="c63b418d-bbe9-4f45-8a71-a8924dda5b1f" containerID="4c2780125bbbed792ca094404e74691edf41ffb8d4f24ad0632929b149bcbdc6" exitCode=0
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.663598 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"c63b418d-bbe9-4f45-8a71-a8924dda5b1f","Type":"ContainerDied","Data":"4c2780125bbbed792ca094404e74691edf41ffb8d4f24ad0632929b149bcbdc6"}
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.663618 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"c63b418d-bbe9-4f45-8a71-a8924dda5b1f","Type":"ContainerDied","Data":"56e2da2636e3af229252318082f8a1d66d36b57dee54819736d458b0aec1b27d"}
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.663636 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1"
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.686274 5008 scope.go:117] "RemoveContainer" containerID="260e14d1e4056fca335fa58cf3ac53afbfb6771eaff72a569223a7ed9c325420"
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.705177 5008 scope.go:117] "RemoveContainer" containerID="d5fdd1562aac9270d23e1ad59d02b48260779b26a929899eef7a4920ccc99fed"
Nov 26 23:01:14 crc kubenswrapper[5008]: E1126 23:01:14.705705 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5fdd1562aac9270d23e1ad59d02b48260779b26a929899eef7a4920ccc99fed\": container with ID starting with d5fdd1562aac9270d23e1ad59d02b48260779b26a929899eef7a4920ccc99fed not found: ID does not exist" containerID="d5fdd1562aac9270d23e1ad59d02b48260779b26a929899eef7a4920ccc99fed"
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.705742 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5fdd1562aac9270d23e1ad59d02b48260779b26a929899eef7a4920ccc99fed"} err="failed to get container status \"d5fdd1562aac9270d23e1ad59d02b48260779b26a929899eef7a4920ccc99fed\": rpc error: code = NotFound desc = could not find container \"d5fdd1562aac9270d23e1ad59d02b48260779b26a929899eef7a4920ccc99fed\": container with ID starting with d5fdd1562aac9270d23e1ad59d02b48260779b26a929899eef7a4920ccc99fed not found: ID does not exist"
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.705764 5008 scope.go:117] "RemoveContainer" containerID="260e14d1e4056fca335fa58cf3ac53afbfb6771eaff72a569223a7ed9c325420"
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.705818 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"]
Nov 26 23:01:14 crc kubenswrapper[5008]: E1126 23:01:14.706191 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"260e14d1e4056fca335fa58cf3ac53afbfb6771eaff72a569223a7ed9c325420\": container with ID starting with 260e14d1e4056fca335fa58cf3ac53afbfb6771eaff72a569223a7ed9c325420 not found: ID does not exist" containerID="260e14d1e4056fca335fa58cf3ac53afbfb6771eaff72a569223a7ed9c325420"
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.706226 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"260e14d1e4056fca335fa58cf3ac53afbfb6771eaff72a569223a7ed9c325420"} err="failed to get container status \"260e14d1e4056fca335fa58cf3ac53afbfb6771eaff72a569223a7ed9c325420\": rpc error: code = NotFound desc = could not find container \"260e14d1e4056fca335fa58cf3ac53afbfb6771eaff72a569223a7ed9c325420\": container with ID starting with 260e14d1e4056fca335fa58cf3ac53afbfb6771eaff72a569223a7ed9c325420 not found: ID does not exist"
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.706250 5008 scope.go:117] "RemoveContainer" containerID="4c2780125bbbed792ca094404e74691edf41ffb8d4f24ad0632929b149bcbdc6"
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.712712 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"]
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.724215 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"]
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.731858 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"]
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.743025 5008 scope.go:117] "RemoveContainer" containerID="0913464a6fd903ee60baf08db5adcb32882e22885be5a221edea228ab790dac9"
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.760692 5008 scope.go:117] "RemoveContainer" containerID="4c2780125bbbed792ca094404e74691edf41ffb8d4f24ad0632929b149bcbdc6"
Nov 26 23:01:14 crc kubenswrapper[5008]: E1126 23:01:14.761587 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c2780125bbbed792ca094404e74691edf41ffb8d4f24ad0632929b149bcbdc6\": container with ID starting with 4c2780125bbbed792ca094404e74691edf41ffb8d4f24ad0632929b149bcbdc6 not found: ID does not exist" containerID="4c2780125bbbed792ca094404e74691edf41ffb8d4f24ad0632929b149bcbdc6"
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.761631 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c2780125bbbed792ca094404e74691edf41ffb8d4f24ad0632929b149bcbdc6"} err="failed to get container status \"4c2780125bbbed792ca094404e74691edf41ffb8d4f24ad0632929b149bcbdc6\": rpc error: code = NotFound desc = could not find container \"4c2780125bbbed792ca094404e74691edf41ffb8d4f24ad0632929b149bcbdc6\": container with ID starting with 4c2780125bbbed792ca094404e74691edf41ffb8d4f24ad0632929b149bcbdc6 not found: ID does not exist"
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.761658 5008 scope.go:117] "RemoveContainer" containerID="0913464a6fd903ee60baf08db5adcb32882e22885be5a221edea228ab790dac9"
Nov 26 23:01:14 crc kubenswrapper[5008]: E1126 23:01:14.762145 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0913464a6fd903ee60baf08db5adcb32882e22885be5a221edea228ab790dac9\": container with ID starting with 0913464a6fd903ee60baf08db5adcb32882e22885be5a221edea228ab790dac9 not found: ID does not exist" containerID="0913464a6fd903ee60baf08db5adcb32882e22885be5a221edea228ab790dac9"
Nov 26 23:01:14 crc kubenswrapper[5008]: I1126 23:01:14.762190 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0913464a6fd903ee60baf08db5adcb32882e22885be5a221edea228ab790dac9"} err="failed to get container status \"0913464a6fd903ee60baf08db5adcb32882e22885be5a221edea228ab790dac9\": rpc error: code = NotFound desc = could not find container \"0913464a6fd903ee60baf08db5adcb32882e22885be5a221edea228ab790dac9\": container with ID starting with 0913464a6fd903ee60baf08db5adcb32882e22885be5a221edea228ab790dac9 not found: ID does not exist"
Nov 26 23:01:15 crc kubenswrapper[5008]: I1126 23:01:15.470225 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Nov 26 23:01:15 crc kubenswrapper[5008]: I1126 23:01:15.470961 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="248fba6a-3e3b-4d60-8124-c68f0c1be9f2" containerName="glance-log" containerID="cri-o://932f7eff37b874c7bf09b2a92298d9a6fce4aff48ee4524165789e8de44d4312" gracePeriod=30
Nov 26 23:01:15 crc kubenswrapper[5008]: I1126 23:01:15.471154 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="248fba6a-3e3b-4d60-8124-c68f0c1be9f2" containerName="glance-httpd" containerID="cri-o://c699c3c93e8e96522c4dc806bba6494c739afa3e78b889fd5aa1bb396cb3548a" gracePeriod=30
Nov 26 23:01:15 crc kubenswrapper[5008]: I1126 23:01:15.532084 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f167c5e-15f2-4d90-8599-c0494c20471a" path="/var/lib/kubelet/pods/9f167c5e-15f2-4d90-8599-c0494c20471a/volumes"
Nov 26 23:01:15 crc kubenswrapper[5008]: I1126 23:01:15.533459 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c63b418d-bbe9-4f45-8a71-a8924dda5b1f" path="/var/lib/kubelet/pods/c63b418d-bbe9-4f45-8a71-a8924dda5b1f/volumes"
Nov 26 23:01:15 crc kubenswrapper[5008]: I1126 23:01:15.676186 5008 generic.go:334] "Generic (PLEG): container finished" podID="248fba6a-3e3b-4d60-8124-c68f0c1be9f2" containerID="932f7eff37b874c7bf09b2a92298d9a6fce4aff48ee4524165789e8de44d4312" exitCode=143
Nov 26 23:01:15 crc kubenswrapper[5008]: I1126 23:01:15.676268 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"248fba6a-3e3b-4d60-8124-c68f0c1be9f2","Type":"ContainerDied","Data":"932f7eff37b874c7bf09b2a92298d9a6fce4aff48ee4524165789e8de44d4312"}
Nov 26 23:01:18 crc kubenswrapper[5008]: I1126 23:01:18.720649 5008 generic.go:334] "Generic (PLEG): container finished" podID="248fba6a-3e3b-4d60-8124-c68f0c1be9f2" containerID="c699c3c93e8e96522c4dc806bba6494c739afa3e78b889fd5aa1bb396cb3548a" exitCode=0
Nov 26 23:01:18 crc kubenswrapper[5008]: I1126 23:01:18.720743 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"248fba6a-3e3b-4d60-8124-c68f0c1be9f2","Type":"ContainerDied","Data":"c699c3c93e8e96522c4dc806bba6494c739afa3e78b889fd5aa1bb396cb3548a"}
Nov 26 23:01:18 crc kubenswrapper[5008]: I1126 23:01:18.990308 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.092417 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") "
Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.092495 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-config-data\") pod \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") "
Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.092518 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-dev\") pod \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") "
Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.092579 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-etc-nvme\") pod \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") "
Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.092605 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-sys\") pod \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") "
Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.092634 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-scripts\") pod \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") "
Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.092644 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-dev" (OuterVolumeSpecName: "dev") pod "248fba6a-3e3b-4d60-8124-c68f0c1be9f2" (UID: "248fba6a-3e3b-4d60-8124-c68f0c1be9f2"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.092653 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "248fba6a-3e3b-4d60-8124-c68f0c1be9f2" (UID: "248fba6a-3e3b-4d60-8124-c68f0c1be9f2"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.092655 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-var-locks-brick\") pod \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") "
Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.092700 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "248fba6a-3e3b-4d60-8124-c68f0c1be9f2" (UID: "248fba6a-3e3b-4d60-8124-c68f0c1be9f2"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.092727 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-run\") pod \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") "
Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.092730 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-sys" (OuterVolumeSpecName: "sys") pod "248fba6a-3e3b-4d60-8124-c68f0c1be9f2" (UID: "248fba6a-3e3b-4d60-8124-c68f0c1be9f2"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.092766 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-etc-iscsi\") pod \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") "
Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.092787 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-run" (OuterVolumeSpecName: "run") pod "248fba6a-3e3b-4d60-8124-c68f0c1be9f2" (UID: "248fba6a-3e3b-4d60-8124-c68f0c1be9f2"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.092794 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-lib-modules\") pod \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") "
Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.092820 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-logs\") pod \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") "
Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.092819 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "248fba6a-3e3b-4d60-8124-c68f0c1be9f2" (UID: "248fba6a-3e3b-4d60-8124-c68f0c1be9f2"). InnerVolumeSpecName "etc-iscsi".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.092840 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.092859 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k97wt\" (UniqueName: \"kubernetes.io/projected/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-kube-api-access-k97wt\") pod \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.093181 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-httpd-run\") pod \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\" (UID: \"248fba6a-3e3b-4d60-8124-c68f0c1be9f2\") " Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.092859 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "248fba6a-3e3b-4d60-8124-c68f0c1be9f2" (UID: "248fba6a-3e3b-4d60-8124-c68f0c1be9f2"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.093505 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-logs" (OuterVolumeSpecName: "logs") pod "248fba6a-3e3b-4d60-8124-c68f0c1be9f2" (UID: "248fba6a-3e3b-4d60-8124-c68f0c1be9f2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.093538 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "248fba6a-3e3b-4d60-8124-c68f0c1be9f2" (UID: "248fba6a-3e3b-4d60-8124-c68f0c1be9f2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.093798 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.093821 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.093835 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-logs\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.093846 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.093858 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-dev\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.093869 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-etc-nvme\") on node \"crc\" 
DevicePath \"\"" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.093879 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-sys\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.093890 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.093903 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-run\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.097638 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "248fba6a-3e3b-4d60-8124-c68f0c1be9f2" (UID: "248fba6a-3e3b-4d60-8124-c68f0c1be9f2"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.097667 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance-cache") pod "248fba6a-3e3b-4d60-8124-c68f0c1be9f2" (UID: "248fba6a-3e3b-4d60-8124-c68f0c1be9f2"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.097643 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-scripts" (OuterVolumeSpecName: "scripts") pod "248fba6a-3e3b-4d60-8124-c68f0c1be9f2" (UID: "248fba6a-3e3b-4d60-8124-c68f0c1be9f2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.098495 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-kube-api-access-k97wt" (OuterVolumeSpecName: "kube-api-access-k97wt") pod "248fba6a-3e3b-4d60-8124-c68f0c1be9f2" (UID: "248fba6a-3e3b-4d60-8124-c68f0c1be9f2"). InnerVolumeSpecName "kube-api-access-k97wt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.133332 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-config-data" (OuterVolumeSpecName: "config-data") pod "248fba6a-3e3b-4d60-8124-c68f0c1be9f2" (UID: "248fba6a-3e3b-4d60-8124-c68f0c1be9f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.194944 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.194989 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k97wt\" (UniqueName: \"kubernetes.io/projected/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-kube-api-access-k97wt\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.195012 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.195022 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 
23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.195031 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/248fba6a-3e3b-4d60-8124-c68f0c1be9f2-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.207619 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.213220 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.296532 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.296738 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.730580 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"248fba6a-3e3b-4d60-8124-c68f0c1be9f2","Type":"ContainerDied","Data":"e52dd40167f05ae499a461fe94e9848fc9aef38d27e82c9de44ec4098ee750c2"} Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.732081 5008 scope.go:117] "RemoveContainer" containerID="c699c3c93e8e96522c4dc806bba6494c739afa3e78b889fd5aa1bb396cb3548a" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.730677 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.756767 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.763042 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 23:01:19 crc kubenswrapper[5008]: I1126 23:01:19.766153 5008 scope.go:117] "RemoveContainer" containerID="932f7eff37b874c7bf09b2a92298d9a6fce4aff48ee4524165789e8de44d4312" Nov 26 23:01:20 crc kubenswrapper[5008]: I1126 23:01:20.873767 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-md85d"] Nov 26 23:01:20 crc kubenswrapper[5008]: I1126 23:01:20.881147 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-md85d"] Nov 26 23:01:20 crc kubenswrapper[5008]: I1126 23:01:20.918331 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance01d8-account-delete-6lvnn"] Nov 26 23:01:20 crc kubenswrapper[5008]: E1126 23:01:20.918581 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28476b64-892d-42b6-8b62-16735b92b9d9" containerName="keystone-cron" Nov 26 23:01:20 crc kubenswrapper[5008]: I1126 23:01:20.918593 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="28476b64-892d-42b6-8b62-16735b92b9d9" containerName="keystone-cron" Nov 26 23:01:20 crc kubenswrapper[5008]: E1126 23:01:20.918605 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f167c5e-15f2-4d90-8599-c0494c20471a" containerName="glance-log" Nov 26 23:01:20 crc kubenswrapper[5008]: I1126 23:01:20.918610 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f167c5e-15f2-4d90-8599-c0494c20471a" containerName="glance-log" Nov 26 23:01:20 crc kubenswrapper[5008]: E1126 23:01:20.918624 5008 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="248fba6a-3e3b-4d60-8124-c68f0c1be9f2" containerName="glance-log" Nov 26 23:01:20 crc kubenswrapper[5008]: I1126 23:01:20.918630 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="248fba6a-3e3b-4d60-8124-c68f0c1be9f2" containerName="glance-log" Nov 26 23:01:20 crc kubenswrapper[5008]: E1126 23:01:20.918641 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c63b418d-bbe9-4f45-8a71-a8924dda5b1f" containerName="glance-httpd" Nov 26 23:01:20 crc kubenswrapper[5008]: I1126 23:01:20.918646 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c63b418d-bbe9-4f45-8a71-a8924dda5b1f" containerName="glance-httpd" Nov 26 23:01:20 crc kubenswrapper[5008]: E1126 23:01:20.918657 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f167c5e-15f2-4d90-8599-c0494c20471a" containerName="glance-httpd" Nov 26 23:01:20 crc kubenswrapper[5008]: I1126 23:01:20.918664 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f167c5e-15f2-4d90-8599-c0494c20471a" containerName="glance-httpd" Nov 26 23:01:20 crc kubenswrapper[5008]: E1126 23:01:20.918676 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c63b418d-bbe9-4f45-8a71-a8924dda5b1f" containerName="glance-log" Nov 26 23:01:20 crc kubenswrapper[5008]: I1126 23:01:20.918681 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c63b418d-bbe9-4f45-8a71-a8924dda5b1f" containerName="glance-log" Nov 26 23:01:20 crc kubenswrapper[5008]: E1126 23:01:20.918692 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="248fba6a-3e3b-4d60-8124-c68f0c1be9f2" containerName="glance-httpd" Nov 26 23:01:20 crc kubenswrapper[5008]: I1126 23:01:20.918697 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="248fba6a-3e3b-4d60-8124-c68f0c1be9f2" containerName="glance-httpd" Nov 26 23:01:20 crc kubenswrapper[5008]: I1126 23:01:20.918803 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="c63b418d-bbe9-4f45-8a71-a8924dda5b1f" 
containerName="glance-httpd" Nov 26 23:01:20 crc kubenswrapper[5008]: I1126 23:01:20.918813 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="28476b64-892d-42b6-8b62-16735b92b9d9" containerName="keystone-cron" Nov 26 23:01:20 crc kubenswrapper[5008]: I1126 23:01:20.918824 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f167c5e-15f2-4d90-8599-c0494c20471a" containerName="glance-httpd" Nov 26 23:01:20 crc kubenswrapper[5008]: I1126 23:01:20.918832 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="248fba6a-3e3b-4d60-8124-c68f0c1be9f2" containerName="glance-log" Nov 26 23:01:20 crc kubenswrapper[5008]: I1126 23:01:20.918842 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="248fba6a-3e3b-4d60-8124-c68f0c1be9f2" containerName="glance-httpd" Nov 26 23:01:20 crc kubenswrapper[5008]: I1126 23:01:20.918850 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="c63b418d-bbe9-4f45-8a71-a8924dda5b1f" containerName="glance-log" Nov 26 23:01:20 crc kubenswrapper[5008]: I1126 23:01:20.918862 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f167c5e-15f2-4d90-8599-c0494c20471a" containerName="glance-log" Nov 26 23:01:20 crc kubenswrapper[5008]: I1126 23:01:20.919281 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance01d8-account-delete-6lvnn" Nov 26 23:01:20 crc kubenswrapper[5008]: I1126 23:01:20.938703 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance01d8-account-delete-6lvnn"] Nov 26 23:01:21 crc kubenswrapper[5008]: I1126 23:01:21.020401 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzf7p\" (UniqueName: \"kubernetes.io/projected/329b9c09-9454-473f-9353-e4431802c407-kube-api-access-wzf7p\") pod \"glance01d8-account-delete-6lvnn\" (UID: \"329b9c09-9454-473f-9353-e4431802c407\") " pod="glance-kuttl-tests/glance01d8-account-delete-6lvnn" Nov 26 23:01:21 crc kubenswrapper[5008]: I1126 23:01:21.020487 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/329b9c09-9454-473f-9353-e4431802c407-operator-scripts\") pod \"glance01d8-account-delete-6lvnn\" (UID: \"329b9c09-9454-473f-9353-e4431802c407\") " pod="glance-kuttl-tests/glance01d8-account-delete-6lvnn" Nov 26 23:01:21 crc kubenswrapper[5008]: I1126 23:01:21.121526 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/329b9c09-9454-473f-9353-e4431802c407-operator-scripts\") pod \"glance01d8-account-delete-6lvnn\" (UID: \"329b9c09-9454-473f-9353-e4431802c407\") " pod="glance-kuttl-tests/glance01d8-account-delete-6lvnn" Nov 26 23:01:21 crc kubenswrapper[5008]: I1126 23:01:21.121695 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzf7p\" (UniqueName: \"kubernetes.io/projected/329b9c09-9454-473f-9353-e4431802c407-kube-api-access-wzf7p\") pod \"glance01d8-account-delete-6lvnn\" (UID: \"329b9c09-9454-473f-9353-e4431802c407\") " pod="glance-kuttl-tests/glance01d8-account-delete-6lvnn" Nov 26 23:01:21 crc kubenswrapper[5008]: I1126 
23:01:21.122787 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/329b9c09-9454-473f-9353-e4431802c407-operator-scripts\") pod \"glance01d8-account-delete-6lvnn\" (UID: \"329b9c09-9454-473f-9353-e4431802c407\") " pod="glance-kuttl-tests/glance01d8-account-delete-6lvnn" Nov 26 23:01:21 crc kubenswrapper[5008]: I1126 23:01:21.139293 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzf7p\" (UniqueName: \"kubernetes.io/projected/329b9c09-9454-473f-9353-e4431802c407-kube-api-access-wzf7p\") pod \"glance01d8-account-delete-6lvnn\" (UID: \"329b9c09-9454-473f-9353-e4431802c407\") " pod="glance-kuttl-tests/glance01d8-account-delete-6lvnn" Nov 26 23:01:21 crc kubenswrapper[5008]: I1126 23:01:21.244931 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance01d8-account-delete-6lvnn" Nov 26 23:01:21 crc kubenswrapper[5008]: I1126 23:01:21.526709 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19002a58-e28b-4950-a48d-ba98820e57fe" path="/var/lib/kubelet/pods/19002a58-e28b-4950-a48d-ba98820e57fe/volumes" Nov 26 23:01:21 crc kubenswrapper[5008]: I1126 23:01:21.527944 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="248fba6a-3e3b-4d60-8124-c68f0c1be9f2" path="/var/lib/kubelet/pods/248fba6a-3e3b-4d60-8124-c68f0c1be9f2/volumes" Nov 26 23:01:21 crc kubenswrapper[5008]: I1126 23:01:21.706061 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance01d8-account-delete-6lvnn"] Nov 26 23:01:21 crc kubenswrapper[5008]: I1126 23:01:21.746580 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance01d8-account-delete-6lvnn" event={"ID":"329b9c09-9454-473f-9353-e4431802c407","Type":"ContainerStarted","Data":"5fbb39a24be2a0665ff6ba0dad4f0d9414a1aead13a65ae65844061cd80ef755"} Nov 26 23:01:22 crc 
kubenswrapper[5008]: I1126 23:01:22.758040 5008 generic.go:334] "Generic (PLEG): container finished" podID="329b9c09-9454-473f-9353-e4431802c407" containerID="3bcd23b6bff3049d8cf209ae76c5b2bebfb89a56548f0b9af956f6c9751147e2" exitCode=0 Nov 26 23:01:22 crc kubenswrapper[5008]: I1126 23:01:22.758463 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance01d8-account-delete-6lvnn" event={"ID":"329b9c09-9454-473f-9353-e4431802c407","Type":"ContainerDied","Data":"3bcd23b6bff3049d8cf209ae76c5b2bebfb89a56548f0b9af956f6c9751147e2"} Nov 26 23:01:24 crc kubenswrapper[5008]: I1126 23:01:24.157541 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance01d8-account-delete-6lvnn" Nov 26 23:01:24 crc kubenswrapper[5008]: I1126 23:01:24.269688 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/329b9c09-9454-473f-9353-e4431802c407-operator-scripts\") pod \"329b9c09-9454-473f-9353-e4431802c407\" (UID: \"329b9c09-9454-473f-9353-e4431802c407\") " Nov 26 23:01:24 crc kubenswrapper[5008]: I1126 23:01:24.269749 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzf7p\" (UniqueName: \"kubernetes.io/projected/329b9c09-9454-473f-9353-e4431802c407-kube-api-access-wzf7p\") pod \"329b9c09-9454-473f-9353-e4431802c407\" (UID: \"329b9c09-9454-473f-9353-e4431802c407\") " Nov 26 23:01:24 crc kubenswrapper[5008]: I1126 23:01:24.270618 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/329b9c09-9454-473f-9353-e4431802c407-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "329b9c09-9454-473f-9353-e4431802c407" (UID: "329b9c09-9454-473f-9353-e4431802c407"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 23:01:24 crc kubenswrapper[5008]: I1126 23:01:24.275161 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/329b9c09-9454-473f-9353-e4431802c407-kube-api-access-wzf7p" (OuterVolumeSpecName: "kube-api-access-wzf7p") pod "329b9c09-9454-473f-9353-e4431802c407" (UID: "329b9c09-9454-473f-9353-e4431802c407"). InnerVolumeSpecName "kube-api-access-wzf7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:01:24 crc kubenswrapper[5008]: I1126 23:01:24.371694 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/329b9c09-9454-473f-9353-e4431802c407-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:24 crc kubenswrapper[5008]: I1126 23:01:24.371728 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzf7p\" (UniqueName: \"kubernetes.io/projected/329b9c09-9454-473f-9353-e4431802c407-kube-api-access-wzf7p\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:24 crc kubenswrapper[5008]: I1126 23:01:24.788864 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance01d8-account-delete-6lvnn" event={"ID":"329b9c09-9454-473f-9353-e4431802c407","Type":"ContainerDied","Data":"5fbb39a24be2a0665ff6ba0dad4f0d9414a1aead13a65ae65844061cd80ef755"} Nov 26 23:01:24 crc kubenswrapper[5008]: I1126 23:01:24.788925 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fbb39a24be2a0665ff6ba0dad4f0d9414a1aead13a65ae65844061cd80ef755" Nov 26 23:01:24 crc kubenswrapper[5008]: I1126 23:01:24.789033 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance01d8-account-delete-6lvnn" Nov 26 23:01:25 crc kubenswrapper[5008]: I1126 23:01:25.957657 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-88296"] Nov 26 23:01:25 crc kubenswrapper[5008]: I1126 23:01:25.967453 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-88296"] Nov 26 23:01:25 crc kubenswrapper[5008]: I1126 23:01:25.994607 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-01d8-account-create-update-4xlrs"] Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.004505 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-01d8-account-create-update-4xlrs"] Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.012782 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance01d8-account-delete-6lvnn"] Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.017948 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance01d8-account-delete-6lvnn"] Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.150035 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-6hmqn"] Nov 26 23:01:26 crc kubenswrapper[5008]: E1126 23:01:26.150399 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329b9c09-9454-473f-9353-e4431802c407" containerName="mariadb-account-delete" Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.150420 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="329b9c09-9454-473f-9353-e4431802c407" containerName="mariadb-account-delete" Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.150592 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="329b9c09-9454-473f-9353-e4431802c407" containerName="mariadb-account-delete" Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.151196 5008 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-6hmqn" Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.165472 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-bcdb-account-create-update-bf972"] Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.166437 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-6hmqn"] Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.166825 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-bcdb-account-create-update-bf972" Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.168923 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-bcdb-account-create-update-bf972"] Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.171697 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.199132 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t7rn\" (UniqueName: \"kubernetes.io/projected/532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d-kube-api-access-9t7rn\") pod \"glance-db-create-6hmqn\" (UID: \"532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d\") " pod="glance-kuttl-tests/glance-db-create-6hmqn" Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.199192 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d-operator-scripts\") pod \"glance-db-create-6hmqn\" (UID: \"532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d\") " pod="glance-kuttl-tests/glance-db-create-6hmqn" Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.300095 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-lfj8x\" (UniqueName: \"kubernetes.io/projected/080f59b8-3df3-4094-8731-926a84ccbfa3-kube-api-access-lfj8x\") pod \"glance-bcdb-account-create-update-bf972\" (UID: \"080f59b8-3df3-4094-8731-926a84ccbfa3\") " pod="glance-kuttl-tests/glance-bcdb-account-create-update-bf972" Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.300294 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t7rn\" (UniqueName: \"kubernetes.io/projected/532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d-kube-api-access-9t7rn\") pod \"glance-db-create-6hmqn\" (UID: \"532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d\") " pod="glance-kuttl-tests/glance-db-create-6hmqn" Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.300371 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d-operator-scripts\") pod \"glance-db-create-6hmqn\" (UID: \"532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d\") " pod="glance-kuttl-tests/glance-db-create-6hmqn" Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.300407 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/080f59b8-3df3-4094-8731-926a84ccbfa3-operator-scripts\") pod \"glance-bcdb-account-create-update-bf972\" (UID: \"080f59b8-3df3-4094-8731-926a84ccbfa3\") " pod="glance-kuttl-tests/glance-bcdb-account-create-update-bf972" Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.301396 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d-operator-scripts\") pod \"glance-db-create-6hmqn\" (UID: \"532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d\") " pod="glance-kuttl-tests/glance-db-create-6hmqn" Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.320251 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t7rn\" (UniqueName: \"kubernetes.io/projected/532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d-kube-api-access-9t7rn\") pod \"glance-db-create-6hmqn\" (UID: \"532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d\") " pod="glance-kuttl-tests/glance-db-create-6hmqn" Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.402241 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/080f59b8-3df3-4094-8731-926a84ccbfa3-operator-scripts\") pod \"glance-bcdb-account-create-update-bf972\" (UID: \"080f59b8-3df3-4094-8731-926a84ccbfa3\") " pod="glance-kuttl-tests/glance-bcdb-account-create-update-bf972" Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.402345 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfj8x\" (UniqueName: \"kubernetes.io/projected/080f59b8-3df3-4094-8731-926a84ccbfa3-kube-api-access-lfj8x\") pod \"glance-bcdb-account-create-update-bf972\" (UID: \"080f59b8-3df3-4094-8731-926a84ccbfa3\") " pod="glance-kuttl-tests/glance-bcdb-account-create-update-bf972" Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.403438 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/080f59b8-3df3-4094-8731-926a84ccbfa3-operator-scripts\") pod \"glance-bcdb-account-create-update-bf972\" (UID: \"080f59b8-3df3-4094-8731-926a84ccbfa3\") " pod="glance-kuttl-tests/glance-bcdb-account-create-update-bf972" Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.424399 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfj8x\" (UniqueName: \"kubernetes.io/projected/080f59b8-3df3-4094-8731-926a84ccbfa3-kube-api-access-lfj8x\") pod \"glance-bcdb-account-create-update-bf972\" (UID: \"080f59b8-3df3-4094-8731-926a84ccbfa3\") " 
pod="glance-kuttl-tests/glance-bcdb-account-create-update-bf972" Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.487460 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-6hmqn" Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.499197 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-bcdb-account-create-update-bf972" Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.805760 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-bcdb-account-create-update-bf972"] Nov 26 23:01:26 crc kubenswrapper[5008]: W1126 23:01:26.809221 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod080f59b8_3df3_4094_8731_926a84ccbfa3.slice/crio-fc39d401fa5443e619a29b4763974b07bde81fa5a3f2757f287a9aa4b7064c82 WatchSource:0}: Error finding container fc39d401fa5443e619a29b4763974b07bde81fa5a3f2757f287a9aa4b7064c82: Status 404 returned error can't find the container with id fc39d401fa5443e619a29b4763974b07bde81fa5a3f2757f287a9aa4b7064c82 Nov 26 23:01:26 crc kubenswrapper[5008]: I1126 23:01:26.866792 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-6hmqn"] Nov 26 23:01:26 crc kubenswrapper[5008]: W1126 23:01:26.876745 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod532919a3_8ed5_46d7_8d1c_ccfbb85a1f1d.slice/crio-480b760410104077de361d4061d640c36ed54204756c96e4d52cb2a483cc6f0a WatchSource:0}: Error finding container 480b760410104077de361d4061d640c36ed54204756c96e4d52cb2a483cc6f0a: Status 404 returned error can't find the container with id 480b760410104077de361d4061d640c36ed54204756c96e4d52cb2a483cc6f0a Nov 26 23:01:27 crc kubenswrapper[5008]: I1126 23:01:27.530351 5008 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="0baeb926-09fd-4832-a4ee-7bee3734bcf1" path="/var/lib/kubelet/pods/0baeb926-09fd-4832-a4ee-7bee3734bcf1/volumes" Nov 26 23:01:27 crc kubenswrapper[5008]: I1126 23:01:27.531714 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="329b9c09-9454-473f-9353-e4431802c407" path="/var/lib/kubelet/pods/329b9c09-9454-473f-9353-e4431802c407/volumes" Nov 26 23:01:27 crc kubenswrapper[5008]: I1126 23:01:27.532515 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56181be6-d505-4a52-b389-a96288fcb920" path="/var/lib/kubelet/pods/56181be6-d505-4a52-b389-a96288fcb920/volumes" Nov 26 23:01:27 crc kubenswrapper[5008]: I1126 23:01:27.824825 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-bcdb-account-create-update-bf972" event={"ID":"080f59b8-3df3-4094-8731-926a84ccbfa3","Type":"ContainerDied","Data":"ac7caade32c4753ee6487bde265e2893ba3e2a5356d4ed4f16d3faa482096286"} Nov 26 23:01:27 crc kubenswrapper[5008]: I1126 23:01:27.824634 5008 generic.go:334] "Generic (PLEG): container finished" podID="080f59b8-3df3-4094-8731-926a84ccbfa3" containerID="ac7caade32c4753ee6487bde265e2893ba3e2a5356d4ed4f16d3faa482096286" exitCode=0 Nov 26 23:01:27 crc kubenswrapper[5008]: I1126 23:01:27.825344 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-bcdb-account-create-update-bf972" event={"ID":"080f59b8-3df3-4094-8731-926a84ccbfa3","Type":"ContainerStarted","Data":"fc39d401fa5443e619a29b4763974b07bde81fa5a3f2757f287a9aa4b7064c82"} Nov 26 23:01:27 crc kubenswrapper[5008]: I1126 23:01:27.827575 5008 generic.go:334] "Generic (PLEG): container finished" podID="532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d" containerID="e581f6bf5f1b1ebba98f132108850c6889a29bf097e4999912b986c64cf90d66" exitCode=0 Nov 26 23:01:27 crc kubenswrapper[5008]: I1126 23:01:27.827737 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-6hmqn" 
event={"ID":"532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d","Type":"ContainerDied","Data":"e581f6bf5f1b1ebba98f132108850c6889a29bf097e4999912b986c64cf90d66"} Nov 26 23:01:27 crc kubenswrapper[5008]: I1126 23:01:27.827769 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-6hmqn" event={"ID":"532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d","Type":"ContainerStarted","Data":"480b760410104077de361d4061d640c36ed54204756c96e4d52cb2a483cc6f0a"} Nov 26 23:01:29 crc kubenswrapper[5008]: I1126 23:01:29.273073 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-bcdb-account-create-update-bf972" Nov 26 23:01:29 crc kubenswrapper[5008]: I1126 23:01:29.279376 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-6hmqn" Nov 26 23:01:29 crc kubenswrapper[5008]: I1126 23:01:29.348752 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfj8x\" (UniqueName: \"kubernetes.io/projected/080f59b8-3df3-4094-8731-926a84ccbfa3-kube-api-access-lfj8x\") pod \"080f59b8-3df3-4094-8731-926a84ccbfa3\" (UID: \"080f59b8-3df3-4094-8731-926a84ccbfa3\") " Nov 26 23:01:29 crc kubenswrapper[5008]: I1126 23:01:29.348879 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d-operator-scripts\") pod \"532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d\" (UID: \"532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d\") " Nov 26 23:01:29 crc kubenswrapper[5008]: I1126 23:01:29.348926 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/080f59b8-3df3-4094-8731-926a84ccbfa3-operator-scripts\") pod \"080f59b8-3df3-4094-8731-926a84ccbfa3\" (UID: \"080f59b8-3df3-4094-8731-926a84ccbfa3\") " Nov 26 23:01:29 crc kubenswrapper[5008]: 
I1126 23:01:29.349072 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t7rn\" (UniqueName: \"kubernetes.io/projected/532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d-kube-api-access-9t7rn\") pod \"532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d\" (UID: \"532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d\") " Nov 26 23:01:29 crc kubenswrapper[5008]: I1126 23:01:29.349912 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/080f59b8-3df3-4094-8731-926a84ccbfa3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "080f59b8-3df3-4094-8731-926a84ccbfa3" (UID: "080f59b8-3df3-4094-8731-926a84ccbfa3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 23:01:29 crc kubenswrapper[5008]: I1126 23:01:29.349916 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d" (UID: "532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 23:01:29 crc kubenswrapper[5008]: I1126 23:01:29.355079 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d-kube-api-access-9t7rn" (OuterVolumeSpecName: "kube-api-access-9t7rn") pod "532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d" (UID: "532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d"). InnerVolumeSpecName "kube-api-access-9t7rn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:01:29 crc kubenswrapper[5008]: I1126 23:01:29.356262 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/080f59b8-3df3-4094-8731-926a84ccbfa3-kube-api-access-lfj8x" (OuterVolumeSpecName: "kube-api-access-lfj8x") pod "080f59b8-3df3-4094-8731-926a84ccbfa3" (UID: "080f59b8-3df3-4094-8731-926a84ccbfa3"). InnerVolumeSpecName "kube-api-access-lfj8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:01:29 crc kubenswrapper[5008]: I1126 23:01:29.451776 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfj8x\" (UniqueName: \"kubernetes.io/projected/080f59b8-3df3-4094-8731-926a84ccbfa3-kube-api-access-lfj8x\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:29 crc kubenswrapper[5008]: I1126 23:01:29.451823 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:29 crc kubenswrapper[5008]: I1126 23:01:29.451843 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/080f59b8-3df3-4094-8731-926a84ccbfa3-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:29 crc kubenswrapper[5008]: I1126 23:01:29.451866 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t7rn\" (UniqueName: \"kubernetes.io/projected/532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d-kube-api-access-9t7rn\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:29 crc kubenswrapper[5008]: I1126 23:01:29.851822 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-bcdb-account-create-update-bf972" event={"ID":"080f59b8-3df3-4094-8731-926a84ccbfa3","Type":"ContainerDied","Data":"fc39d401fa5443e619a29b4763974b07bde81fa5a3f2757f287a9aa4b7064c82"} Nov 26 23:01:29 crc kubenswrapper[5008]: 
I1126 23:01:29.852212 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc39d401fa5443e619a29b4763974b07bde81fa5a3f2757f287a9aa4b7064c82" Nov 26 23:01:29 crc kubenswrapper[5008]: I1126 23:01:29.852295 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-bcdb-account-create-update-bf972" Nov 26 23:01:29 crc kubenswrapper[5008]: I1126 23:01:29.855827 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-6hmqn" event={"ID":"532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d","Type":"ContainerDied","Data":"480b760410104077de361d4061d640c36ed54204756c96e4d52cb2a483cc6f0a"} Nov 26 23:01:29 crc kubenswrapper[5008]: I1126 23:01:29.856401 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="480b760410104077de361d4061d640c36ed54204756c96e4d52cb2a483cc6f0a" Nov 26 23:01:29 crc kubenswrapper[5008]: I1126 23:01:29.856238 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-6hmqn" Nov 26 23:01:31 crc kubenswrapper[5008]: I1126 23:01:31.345405 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-lzbxn"] Nov 26 23:01:31 crc kubenswrapper[5008]: E1126 23:01:31.345842 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="080f59b8-3df3-4094-8731-926a84ccbfa3" containerName="mariadb-account-create-update" Nov 26 23:01:31 crc kubenswrapper[5008]: I1126 23:01:31.345863 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="080f59b8-3df3-4094-8731-926a84ccbfa3" containerName="mariadb-account-create-update" Nov 26 23:01:31 crc kubenswrapper[5008]: E1126 23:01:31.345923 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d" containerName="mariadb-database-create" Nov 26 23:01:31 crc kubenswrapper[5008]: I1126 23:01:31.345943 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d" containerName="mariadb-database-create" Nov 26 23:01:31 crc kubenswrapper[5008]: I1126 23:01:31.346299 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="080f59b8-3df3-4094-8731-926a84ccbfa3" containerName="mariadb-account-create-update" Nov 26 23:01:31 crc kubenswrapper[5008]: I1126 23:01:31.346319 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="532919a3-8ed5-46d7-8d1c-ccfbb85a1f1d" containerName="mariadb-database-create" Nov 26 23:01:31 crc kubenswrapper[5008]: I1126 23:01:31.347070 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-lzbxn" Nov 26 23:01:31 crc kubenswrapper[5008]: I1126 23:01:31.350845 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-tdgbt" Nov 26 23:01:31 crc kubenswrapper[5008]: I1126 23:01:31.351836 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Nov 26 23:01:31 crc kubenswrapper[5008]: I1126 23:01:31.376268 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-lzbxn"] Nov 26 23:01:31 crc kubenswrapper[5008]: I1126 23:01:31.486479 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4c8931d9-fc26-4a78-b6a4-ec2b09c6878b-db-sync-config-data\") pod \"glance-db-sync-lzbxn\" (UID: \"4c8931d9-fc26-4a78-b6a4-ec2b09c6878b\") " pod="glance-kuttl-tests/glance-db-sync-lzbxn" Nov 26 23:01:31 crc kubenswrapper[5008]: I1126 23:01:31.486544 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c8931d9-fc26-4a78-b6a4-ec2b09c6878b-config-data\") pod \"glance-db-sync-lzbxn\" (UID: \"4c8931d9-fc26-4a78-b6a4-ec2b09c6878b\") " pod="glance-kuttl-tests/glance-db-sync-lzbxn" Nov 26 23:01:31 crc kubenswrapper[5008]: I1126 23:01:31.486595 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q22zx\" (UniqueName: \"kubernetes.io/projected/4c8931d9-fc26-4a78-b6a4-ec2b09c6878b-kube-api-access-q22zx\") pod \"glance-db-sync-lzbxn\" (UID: \"4c8931d9-fc26-4a78-b6a4-ec2b09c6878b\") " pod="glance-kuttl-tests/glance-db-sync-lzbxn" Nov 26 23:01:31 crc kubenswrapper[5008]: I1126 23:01:31.587582 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/4c8931d9-fc26-4a78-b6a4-ec2b09c6878b-db-sync-config-data\") pod \"glance-db-sync-lzbxn\" (UID: \"4c8931d9-fc26-4a78-b6a4-ec2b09c6878b\") " pod="glance-kuttl-tests/glance-db-sync-lzbxn" Nov 26 23:01:31 crc kubenswrapper[5008]: I1126 23:01:31.587655 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c8931d9-fc26-4a78-b6a4-ec2b09c6878b-config-data\") pod \"glance-db-sync-lzbxn\" (UID: \"4c8931d9-fc26-4a78-b6a4-ec2b09c6878b\") " pod="glance-kuttl-tests/glance-db-sync-lzbxn" Nov 26 23:01:31 crc kubenswrapper[5008]: I1126 23:01:31.587709 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q22zx\" (UniqueName: \"kubernetes.io/projected/4c8931d9-fc26-4a78-b6a4-ec2b09c6878b-kube-api-access-q22zx\") pod \"glance-db-sync-lzbxn\" (UID: \"4c8931d9-fc26-4a78-b6a4-ec2b09c6878b\") " pod="glance-kuttl-tests/glance-db-sync-lzbxn" Nov 26 23:01:31 crc kubenswrapper[5008]: I1126 23:01:31.594700 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4c8931d9-fc26-4a78-b6a4-ec2b09c6878b-db-sync-config-data\") pod \"glance-db-sync-lzbxn\" (UID: \"4c8931d9-fc26-4a78-b6a4-ec2b09c6878b\") " pod="glance-kuttl-tests/glance-db-sync-lzbxn" Nov 26 23:01:31 crc kubenswrapper[5008]: I1126 23:01:31.598820 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c8931d9-fc26-4a78-b6a4-ec2b09c6878b-config-data\") pod \"glance-db-sync-lzbxn\" (UID: \"4c8931d9-fc26-4a78-b6a4-ec2b09c6878b\") " pod="glance-kuttl-tests/glance-db-sync-lzbxn" Nov 26 23:01:31 crc kubenswrapper[5008]: I1126 23:01:31.617787 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q22zx\" (UniqueName: \"kubernetes.io/projected/4c8931d9-fc26-4a78-b6a4-ec2b09c6878b-kube-api-access-q22zx\") pod 
\"glance-db-sync-lzbxn\" (UID: \"4c8931d9-fc26-4a78-b6a4-ec2b09c6878b\") " pod="glance-kuttl-tests/glance-db-sync-lzbxn" Nov 26 23:01:31 crc kubenswrapper[5008]: I1126 23:01:31.685040 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-lzbxn" Nov 26 23:01:32 crc kubenswrapper[5008]: I1126 23:01:32.157472 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-lzbxn"] Nov 26 23:01:32 crc kubenswrapper[5008]: I1126 23:01:32.887873 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-lzbxn" event={"ID":"4c8931d9-fc26-4a78-b6a4-ec2b09c6878b","Type":"ContainerStarted","Data":"77a29a9ec2d9af87f5c802070bc742bc9870a398a24049e3682c36aaf6026e03"} Nov 26 23:01:32 crc kubenswrapper[5008]: I1126 23:01:32.888308 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-lzbxn" event={"ID":"4c8931d9-fc26-4a78-b6a4-ec2b09c6878b","Type":"ContainerStarted","Data":"5b21c9cb99b7d84b43be743ff48c5fe8493ce4287c0773bf61764b79329f27c1"} Nov 26 23:01:32 crc kubenswrapper[5008]: I1126 23:01:32.909761 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-lzbxn" podStartSLOduration=1.909732828 podStartE2EDuration="1.909732828s" podCreationTimestamp="2025-11-26 23:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:01:32.903883135 +0000 UTC m=+1368.316577137" watchObservedRunningTime="2025-11-26 23:01:32.909732828 +0000 UTC m=+1368.322426860" Nov 26 23:01:35 crc kubenswrapper[5008]: I1126 23:01:35.933284 5008 generic.go:334] "Generic (PLEG): container finished" podID="4c8931d9-fc26-4a78-b6a4-ec2b09c6878b" containerID="77a29a9ec2d9af87f5c802070bc742bc9870a398a24049e3682c36aaf6026e03" exitCode=0 Nov 26 23:01:35 crc kubenswrapper[5008]: I1126 23:01:35.933394 5008 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-lzbxn" event={"ID":"4c8931d9-fc26-4a78-b6a4-ec2b09c6878b","Type":"ContainerDied","Data":"77a29a9ec2d9af87f5c802070bc742bc9870a398a24049e3682c36aaf6026e03"} Nov 26 23:01:37 crc kubenswrapper[5008]: I1126 23:01:37.409494 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-lzbxn" Nov 26 23:01:37 crc kubenswrapper[5008]: I1126 23:01:37.482297 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c8931d9-fc26-4a78-b6a4-ec2b09c6878b-config-data\") pod \"4c8931d9-fc26-4a78-b6a4-ec2b09c6878b\" (UID: \"4c8931d9-fc26-4a78-b6a4-ec2b09c6878b\") " Nov 26 23:01:37 crc kubenswrapper[5008]: I1126 23:01:37.482477 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4c8931d9-fc26-4a78-b6a4-ec2b09c6878b-db-sync-config-data\") pod \"4c8931d9-fc26-4a78-b6a4-ec2b09c6878b\" (UID: \"4c8931d9-fc26-4a78-b6a4-ec2b09c6878b\") " Nov 26 23:01:37 crc kubenswrapper[5008]: I1126 23:01:37.482532 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q22zx\" (UniqueName: \"kubernetes.io/projected/4c8931d9-fc26-4a78-b6a4-ec2b09c6878b-kube-api-access-q22zx\") pod \"4c8931d9-fc26-4a78-b6a4-ec2b09c6878b\" (UID: \"4c8931d9-fc26-4a78-b6a4-ec2b09c6878b\") " Nov 26 23:01:37 crc kubenswrapper[5008]: I1126 23:01:37.491102 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c8931d9-fc26-4a78-b6a4-ec2b09c6878b-kube-api-access-q22zx" (OuterVolumeSpecName: "kube-api-access-q22zx") pod "4c8931d9-fc26-4a78-b6a4-ec2b09c6878b" (UID: "4c8931d9-fc26-4a78-b6a4-ec2b09c6878b"). InnerVolumeSpecName "kube-api-access-q22zx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:01:37 crc kubenswrapper[5008]: I1126 23:01:37.492249 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c8931d9-fc26-4a78-b6a4-ec2b09c6878b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4c8931d9-fc26-4a78-b6a4-ec2b09c6878b" (UID: "4c8931d9-fc26-4a78-b6a4-ec2b09c6878b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:01:37 crc kubenswrapper[5008]: I1126 23:01:37.542255 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c8931d9-fc26-4a78-b6a4-ec2b09c6878b-config-data" (OuterVolumeSpecName: "config-data") pod "4c8931d9-fc26-4a78-b6a4-ec2b09c6878b" (UID: "4c8931d9-fc26-4a78-b6a4-ec2b09c6878b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:01:37 crc kubenswrapper[5008]: I1126 23:01:37.584040 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q22zx\" (UniqueName: \"kubernetes.io/projected/4c8931d9-fc26-4a78-b6a4-ec2b09c6878b-kube-api-access-q22zx\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:37 crc kubenswrapper[5008]: I1126 23:01:37.584085 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c8931d9-fc26-4a78-b6a4-ec2b09c6878b-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:37 crc kubenswrapper[5008]: I1126 23:01:37.584108 5008 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4c8931d9-fc26-4a78-b6a4-ec2b09c6878b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:37 crc kubenswrapper[5008]: I1126 23:01:37.960864 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-lzbxn" 
event={"ID":"4c8931d9-fc26-4a78-b6a4-ec2b09c6878b","Type":"ContainerDied","Data":"5b21c9cb99b7d84b43be743ff48c5fe8493ce4287c0773bf61764b79329f27c1"} Nov 26 23:01:37 crc kubenswrapper[5008]: I1126 23:01:37.960926 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b21c9cb99b7d84b43be743ff48c5fe8493ce4287c0773bf61764b79329f27c1" Nov 26 23:01:37 crc kubenswrapper[5008]: I1126 23:01:37.961050 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-lzbxn" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.122652 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 26 23:01:39 crc kubenswrapper[5008]: E1126 23:01:39.123300 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c8931d9-fc26-4a78-b6a4-ec2b09c6878b" containerName="glance-db-sync" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.123314 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c8931d9-fc26-4a78-b6a4-ec2b09c6878b" containerName="glance-db-sync" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.123474 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c8931d9-fc26-4a78-b6a4-ec2b09c6878b" containerName="glance-db-sync" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.124262 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.126283 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-tdgbt" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.127518 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.128122 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.207246 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.309499 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.309838 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9q2q\" (UniqueName: \"kubernetes.io/projected/7973aa8f-2475-4fef-8928-9ebc82c3cb10-kube-api-access-w9q2q\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.310063 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.310168 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-dev\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.310206 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7973aa8f-2475-4fef-8928-9ebc82c3cb10-scripts\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.310245 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-sys\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.310284 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7973aa8f-2475-4fef-8928-9ebc82c3cb10-logs\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.310335 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: 
\"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.310416 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.310485 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.310556 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-run\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.310638 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7973aa8f-2475-4fef-8928-9ebc82c3cb10-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.310690 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7973aa8f-2475-4fef-8928-9ebc82c3cb10-config-data\") pod 
\"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.310733 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.333013 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.335308 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.352660 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.406130 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.407286 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.409902 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.412850 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.412914 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9q2q\" (UniqueName: \"kubernetes.io/projected/7973aa8f-2475-4fef-8928-9ebc82c3cb10-kube-api-access-w9q2q\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.412952 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.412998 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-dev\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.413020 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7973aa8f-2475-4fef-8928-9ebc82c3cb10-scripts\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.413041 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-sys\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.413063 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7973aa8f-2475-4fef-8928-9ebc82c3cb10-logs\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.413085 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.413114 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.413143 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.413174 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-run\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.413208 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7973aa8f-2475-4fef-8928-9ebc82c3cb10-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.413238 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7973aa8f-2475-4fef-8928-9ebc82c3cb10-config-data\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.413262 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.413387 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.413426 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.414104 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.414145 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-dev\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.414439 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-sys\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.414866 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-lib-modules\") pod 
\"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.414876 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7973aa8f-2475-4fef-8928-9ebc82c3cb10-logs\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.414954 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.414999 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7973aa8f-2475-4fef-8928-9ebc82c3cb10-run\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.415249 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7973aa8f-2475-4fef-8928-9ebc82c3cb10-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.415614 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") 
device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.426841 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7973aa8f-2475-4fef-8928-9ebc82c3cb10-scripts\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.429156 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7973aa8f-2475-4fef-8928-9ebc82c3cb10-config-data\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.451726 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.467278 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.471358 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.472714 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.478576 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9q2q\" (UniqueName: \"kubernetes.io/projected/7973aa8f-2475-4fef-8928-9ebc82c3cb10-kube-api-access-w9q2q\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.496297 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.501705 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-1\" (UID: \"7973aa8f-2475-4fef-8928-9ebc82c3cb10\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.518671 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-dev\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.518706 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/955b831b-9624-46ff-9d27-38f67da8ff96-logs\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.518735 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.518753 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-run\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.518877 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.518916 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.518953 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1705cbdd-f302-404d-9f4b-535a49355777-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519010 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519028 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519063 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519123 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/955b831b-9624-46ff-9d27-38f67da8ff96-scripts\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519142 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-run\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519171 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/955b831b-9624-46ff-9d27-38f67da8ff96-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519188 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-sys\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519253 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519287 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519310 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-dev\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519332 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519376 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1705cbdd-f302-404d-9f4b-535a49355777-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519391 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519413 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519431 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 
23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519467 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5l4w\" (UniqueName: \"kubernetes.io/projected/1705cbdd-f302-404d-9f4b-535a49355777-kube-api-access-v5l4w\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519493 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519509 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cffs4\" (UniqueName: \"kubernetes.io/projected/955b831b-9624-46ff-9d27-38f67da8ff96-kube-api-access-cffs4\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519528 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/955b831b-9624-46ff-9d27-38f67da8ff96-config-data\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519549 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-sys\") pod \"glance-default-external-api-0\" (UID: 
\"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519569 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1705cbdd-f302-404d-9f4b-535a49355777-logs\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.519594 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1705cbdd-f302-404d-9f4b-535a49355777-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.620569 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.620623 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1705cbdd-f302-404d-9f4b-535a49355777-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.620650 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f71b609e-9e99-4919-ac62-b2d893d40ba0-logs\") pod \"glance-default-internal-api-1\" (UID: 
\"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.620669 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.620737 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.620777 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.620788 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621143 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1705cbdd-f302-404d-9f4b-535a49355777-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621253 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621409 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621527 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/955b831b-9624-46ff-9d27-38f67da8ff96-scripts\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621555 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f71b609e-9e99-4919-ac62-b2d893d40ba0-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621580 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-run\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621600 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-sys\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621625 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71b609e-9e99-4919-ac62-b2d893d40ba0-config-data\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621645 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/955b831b-9624-46ff-9d27-38f67da8ff96-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621665 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f71b609e-9e99-4919-ac62-b2d893d40ba0-scripts\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621682 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-sys\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621704 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621714 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-run\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621746 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621728 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621784 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc 
kubenswrapper[5008]: I1126 23:01:39.621823 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-dev\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621869 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621918 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-dev\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621829 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-sys\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621934 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1705cbdd-f302-404d-9f4b-535a49355777-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.621978 5008 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622021 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7zfn\" (UniqueName: \"kubernetes.io/projected/f71b609e-9e99-4919-ac62-b2d893d40ba0-kube-api-access-c7zfn\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622049 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622061 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622075 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622113 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622133 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622137 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622146 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5l4w\" (UniqueName: \"kubernetes.io/projected/1705cbdd-f302-404d-9f4b-535a49355777-kube-api-access-v5l4w\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622144 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/955b831b-9624-46ff-9d27-38f67da8ff96-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622220 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622246 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622269 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-run\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622342 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cffs4\" (UniqueName: \"kubernetes.io/projected/955b831b-9624-46ff-9d27-38f67da8ff96-kube-api-access-cffs4\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622383 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/955b831b-9624-46ff-9d27-38f67da8ff96-config-data\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622426 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-sys\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622471 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1705cbdd-f302-404d-9f4b-535a49355777-logs\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622506 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1705cbdd-f302-404d-9f4b-535a49355777-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622520 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-sys\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622543 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-dev\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622568 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-dev\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622590 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/955b831b-9624-46ff-9d27-38f67da8ff96-logs\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622635 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622668 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622691 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-run\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622726 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622729 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") device mount path \"/mnt/openstack/pv18\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622753 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1705cbdd-f302-404d-9f4b-535a49355777-logs\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622777 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622811 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-dev\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622849 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-run\") pod 
\"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622869 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/955b831b-9624-46ff-9d27-38f67da8ff96-logs\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622912 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622923 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.622999 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.623118 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") 
device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.624074 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.626286 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/955b831b-9624-46ff-9d27-38f67da8ff96-scripts\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.626756 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1705cbdd-f302-404d-9f4b-535a49355777-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.627613 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1705cbdd-f302-404d-9f4b-535a49355777-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.630147 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/955b831b-9624-46ff-9d27-38f67da8ff96-config-data\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.638570 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5l4w\" (UniqueName: \"kubernetes.io/projected/1705cbdd-f302-404d-9f4b-535a49355777-kube-api-access-v5l4w\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.641315 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cffs4\" (UniqueName: \"kubernetes.io/projected/955b831b-9624-46ff-9d27-38f67da8ff96-kube-api-access-cffs4\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.647507 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.649763 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.657734 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc 
kubenswrapper[5008]: I1126 23:01:39.661821 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.675237 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.677547 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.724250 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f71b609e-9e99-4919-ac62-b2d893d40ba0-logs\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.724326 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f71b609e-9e99-4919-ac62-b2d893d40ba0-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.724346 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-sys\") pod \"glance-default-internal-api-1\" (UID: 
\"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.724365 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71b609e-9e99-4919-ac62-b2d893d40ba0-config-data\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.724379 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f71b609e-9e99-4919-ac62-b2d893d40ba0-scripts\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.724437 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7zfn\" (UniqueName: \"kubernetes.io/projected/f71b609e-9e99-4919-ac62-b2d893d40ba0-kube-api-access-c7zfn\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.724457 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.724491 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-1\" (UID: 
\"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.724505 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-run\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.724558 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-dev\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.724579 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.724611 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.724646 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.724710 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.725455 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f71b609e-9e99-4919-ac62-b2d893d40ba0-logs\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.725559 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.725959 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.726520 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f71b609e-9e99-4919-ac62-b2d893d40ba0-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" 
Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.726566 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-sys\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.735307 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.735381 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-dev\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.735432 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-run\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.735506 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.738740 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.740016 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f71b609e-9e99-4919-ac62-b2d893d40ba0-scripts\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.743426 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7zfn\" (UniqueName: \"kubernetes.io/projected/f71b609e-9e99-4919-ac62-b2d893d40ba0-kube-api-access-c7zfn\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.747308 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.747980 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71b609e-9e99-4919-ac62-b2d893d40ba0-config-data\") pod \"glance-default-internal-api-1\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.834242 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:39 crc kubenswrapper[5008]: I1126 23:01:39.844788 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:40 crc kubenswrapper[5008]: I1126 23:01:40.061924 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"]
Nov 26 23:01:40 crc kubenswrapper[5008]: I1126 23:01:40.097639 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"]
Nov 26 23:01:40 crc kubenswrapper[5008]: I1126 23:01:40.223717 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"]
Nov 26 23:01:40 crc kubenswrapper[5008]: W1126 23:01:40.226392 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7973aa8f_2475_4fef_8928_9ebc82c3cb10.slice/crio-44ffd6aeecc6f361e5a31da39f6d1033fbdd0c7983eac2c03010d2e4f4a54326 WatchSource:0}: Error finding container 44ffd6aeecc6f361e5a31da39f6d1033fbdd0c7983eac2c03010d2e4f4a54326: Status 404 returned error can't find the container with id 44ffd6aeecc6f361e5a31da39f6d1033fbdd0c7983eac2c03010d2e4f4a54326
Nov 26 23:01:40 crc kubenswrapper[5008]: I1126 23:01:40.367427 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"]
Nov 26 23:01:40 crc kubenswrapper[5008]: I1126 23:01:40.374857 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"]
Nov 26 23:01:40 crc kubenswrapper[5008]: W1126 23:01:40.376446 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1705cbdd_f302_404d_9f4b_535a49355777.slice/crio-52cf6262921b5baba1349b526a460e9c3c7232ef52d6034da72098642cf44345 WatchSource:0}: Error finding container 52cf6262921b5baba1349b526a460e9c3c7232ef52d6034da72098642cf44345: Status 404 returned error can't find the container with id 52cf6262921b5baba1349b526a460e9c3c7232ef52d6034da72098642cf44345
Nov 26 23:01:40 crc kubenswrapper[5008]: W1126 23:01:40.383392 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf71b609e_9e99_4919_ac62_b2d893d40ba0.slice/crio-dc54a31a0f868c0f5364f33bb3748c1260473fa2580f8129e9afa46b7a2bfbc6 WatchSource:0}: Error finding container dc54a31a0f868c0f5364f33bb3748c1260473fa2580f8129e9afa46b7a2bfbc6: Status 404 returned error can't find the container with id dc54a31a0f868c0f5364f33bb3748c1260473fa2580f8129e9afa46b7a2bfbc6
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.016848 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"1705cbdd-f302-404d-9f4b-535a49355777","Type":"ContainerStarted","Data":"4731f5e1e8e9f1b276c6aa234a1839447ae9f1dd36b29d01e65647efce4af484"}
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.019825 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"1705cbdd-f302-404d-9f4b-535a49355777","Type":"ContainerStarted","Data":"326f949b42bf51c98cc6be3b5f9041f85f5ea619e74c4362e7f5d00dccfa3f9c"}
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.019855 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"1705cbdd-f302-404d-9f4b-535a49355777","Type":"ContainerStarted","Data":"52cf6262921b5baba1349b526a460e9c3c7232ef52d6034da72098642cf44345"}
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.022848 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"7973aa8f-2475-4fef-8928-9ebc82c3cb10","Type":"ContainerStarted","Data":"00983fd2329d745f3997848d4e7c3d8b18d5576a544f12a272f3338bdc5301dc"}
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.022909 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"7973aa8f-2475-4fef-8928-9ebc82c3cb10","Type":"ContainerStarted","Data":"595026661dd4b5c220755e3ae94268cfca8b1ad74402b02663271781259c0642"}
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.022931 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"7973aa8f-2475-4fef-8928-9ebc82c3cb10","Type":"ContainerStarted","Data":"44ffd6aeecc6f361e5a31da39f6d1033fbdd0c7983eac2c03010d2e4f4a54326"}
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.028182 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"f71b609e-9e99-4919-ac62-b2d893d40ba0","Type":"ContainerStarted","Data":"8d9b48d20914c15be4ed1fb34f429ed1da59bb87fdbbd68c446d6529b39d463a"}
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.028242 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"f71b609e-9e99-4919-ac62-b2d893d40ba0","Type":"ContainerStarted","Data":"6e3f90834215de431ca2507f6a56379b778210ba3f24388f64419c007c291d83"}
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.028266 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"f71b609e-9e99-4919-ac62-b2d893d40ba0","Type":"ContainerStarted","Data":"dc54a31a0f868c0f5364f33bb3748c1260473fa2580f8129e9afa46b7a2bfbc6"}
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.028335 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="f71b609e-9e99-4919-ac62-b2d893d40ba0" containerName="glance-log" containerID="cri-o://6e3f90834215de431ca2507f6a56379b778210ba3f24388f64419c007c291d83" gracePeriod=30
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.028413 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="f71b609e-9e99-4919-ac62-b2d893d40ba0" containerName="glance-httpd" containerID="cri-o://8d9b48d20914c15be4ed1fb34f429ed1da59bb87fdbbd68c446d6529b39d463a" gracePeriod=30
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.031675 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"955b831b-9624-46ff-9d27-38f67da8ff96","Type":"ContainerStarted","Data":"825ed604c3d9038894792d863cf725cf5670138f5bd356628a520460334bae06"}
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.031728 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"955b831b-9624-46ff-9d27-38f67da8ff96","Type":"ContainerStarted","Data":"8045232b0cb8607598172523ef064ffc35a3c549ef23cba8f69af565d7556fb9"}
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.031774 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"955b831b-9624-46ff-9d27-38f67da8ff96","Type":"ContainerStarted","Data":"a4a29d4a453d9fe6ea615681401ed8acff0ccca7b25b250ffe3d18b57e5b500f"}
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.046913 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.046882245 podStartE2EDuration="3.046882245s" podCreationTimestamp="2025-11-26 23:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:01:41.040512494 +0000 UTC m=+1376.453206516" watchObservedRunningTime="2025-11-26 23:01:41.046882245 +0000 UTC m=+1376.459576257"
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.092704 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.092687653 podStartE2EDuration="3.092687653s" podCreationTimestamp="2025-11-26 23:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:01:41.087154719 +0000 UTC m=+1376.499848721" watchObservedRunningTime="2025-11-26 23:01:41.092687653 +0000 UTC m=+1376.505381665"
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.094777 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=2.094769238 podStartE2EDuration="2.094769238s" podCreationTimestamp="2025-11-26 23:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:01:41.064936891 +0000 UTC m=+1376.477630893" watchObservedRunningTime="2025-11-26 23:01:41.094769238 +0000 UTC m=+1376.507463260"
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.113846 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=3.113818347 podStartE2EDuration="3.113818347s" podCreationTimestamp="2025-11-26 23:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:01:41.109068017 +0000 UTC m=+1376.521762029" watchObservedRunningTime="2025-11-26 23:01:41.113818347 +0000 UTC m=+1376.526512349"
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.386522 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.452793 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f71b609e-9e99-4919-ac62-b2d893d40ba0-scripts\") pod \"f71b609e-9e99-4919-ac62-b2d893d40ba0\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") "
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.452872 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f71b609e-9e99-4919-ac62-b2d893d40ba0-httpd-run\") pod \"f71b609e-9e99-4919-ac62-b2d893d40ba0\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") "
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.452894 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-lib-modules\") pod \"f71b609e-9e99-4919-ac62-b2d893d40ba0\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") "
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.452918 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-sys\") pod \"f71b609e-9e99-4919-ac62-b2d893d40ba0\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") "
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.452989 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"f71b609e-9e99-4919-ac62-b2d893d40ba0\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") "
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.453010 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-etc-iscsi\") pod \"f71b609e-9e99-4919-ac62-b2d893d40ba0\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") "
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.453046 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7zfn\" (UniqueName: \"kubernetes.io/projected/f71b609e-9e99-4919-ac62-b2d893d40ba0-kube-api-access-c7zfn\") pod \"f71b609e-9e99-4919-ac62-b2d893d40ba0\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") "
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.453038 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "f71b609e-9e99-4919-ac62-b2d893d40ba0" (UID: "f71b609e-9e99-4919-ac62-b2d893d40ba0"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.453070 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"f71b609e-9e99-4919-ac62-b2d893d40ba0\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") "
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.453256 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71b609e-9e99-4919-ac62-b2d893d40ba0-config-data\") pod \"f71b609e-9e99-4919-ac62-b2d893d40ba0\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") "
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.453164 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "f71b609e-9e99-4919-ac62-b2d893d40ba0" (UID: "f71b609e-9e99-4919-ac62-b2d893d40ba0"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.453260 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71b609e-9e99-4919-ac62-b2d893d40ba0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f71b609e-9e99-4919-ac62-b2d893d40ba0" (UID: "f71b609e-9e99-4919-ac62-b2d893d40ba0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.453323 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-var-locks-brick\") pod \"f71b609e-9e99-4919-ac62-b2d893d40ba0\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") "
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.453400 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "f71b609e-9e99-4919-ac62-b2d893d40ba0" (UID: "f71b609e-9e99-4919-ac62-b2d893d40ba0"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.453478 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f71b609e-9e99-4919-ac62-b2d893d40ba0-logs\") pod \"f71b609e-9e99-4919-ac62-b2d893d40ba0\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") "
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.453788 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71b609e-9e99-4919-ac62-b2d893d40ba0-logs" (OuterVolumeSpecName: "logs") pod "f71b609e-9e99-4919-ac62-b2d893d40ba0" (UID: "f71b609e-9e99-4919-ac62-b2d893d40ba0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.453816 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-sys" (OuterVolumeSpecName: "sys") pod "f71b609e-9e99-4919-ac62-b2d893d40ba0" (UID: "f71b609e-9e99-4919-ac62-b2d893d40ba0"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.453863 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-dev\") pod \"f71b609e-9e99-4919-ac62-b2d893d40ba0\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") "
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.453939 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-dev" (OuterVolumeSpecName: "dev") pod "f71b609e-9e99-4919-ac62-b2d893d40ba0" (UID: "f71b609e-9e99-4919-ac62-b2d893d40ba0"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.454066 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-run\") pod \"f71b609e-9e99-4919-ac62-b2d893d40ba0\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") "
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.454145 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-run" (OuterVolumeSpecName: "run") pod "f71b609e-9e99-4919-ac62-b2d893d40ba0" (UID: "f71b609e-9e99-4919-ac62-b2d893d40ba0"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.454209 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-etc-nvme\") pod \"f71b609e-9e99-4919-ac62-b2d893d40ba0\" (UID: \"f71b609e-9e99-4919-ac62-b2d893d40ba0\") "
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.454285 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "f71b609e-9e99-4919-ac62-b2d893d40ba0" (UID: "f71b609e-9e99-4919-ac62-b2d893d40ba0"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.454954 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-run\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.455063 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-etc-nvme\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.455081 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f71b609e-9e99-4919-ac62-b2d893d40ba0-httpd-run\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.455095 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-lib-modules\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.455109 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-sys\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.455125 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-etc-iscsi\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.455139 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-var-locks-brick\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.455155 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f71b609e-9e99-4919-ac62-b2d893d40ba0-logs\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.455170 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f71b609e-9e99-4919-ac62-b2d893d40ba0-dev\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.458816 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71b609e-9e99-4919-ac62-b2d893d40ba0-scripts" (OuterVolumeSpecName: "scripts") pod "f71b609e-9e99-4919-ac62-b2d893d40ba0" (UID: "f71b609e-9e99-4919-ac62-b2d893d40ba0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.458856 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "f71b609e-9e99-4919-ac62-b2d893d40ba0" (UID: "f71b609e-9e99-4919-ac62-b2d893d40ba0"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.458879 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f71b609e-9e99-4919-ac62-b2d893d40ba0-kube-api-access-c7zfn" (OuterVolumeSpecName: "kube-api-access-c7zfn") pod "f71b609e-9e99-4919-ac62-b2d893d40ba0" (UID: "f71b609e-9e99-4919-ac62-b2d893d40ba0"). InnerVolumeSpecName "kube-api-access-c7zfn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.461324 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "f71b609e-9e99-4919-ac62-b2d893d40ba0" (UID: "f71b609e-9e99-4919-ac62-b2d893d40ba0"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.502828 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71b609e-9e99-4919-ac62-b2d893d40ba0-config-data" (OuterVolumeSpecName: "config-data") pod "f71b609e-9e99-4919-ac62-b2d893d40ba0" (UID: "f71b609e-9e99-4919-ac62-b2d893d40ba0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.556587 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f71b609e-9e99-4919-ac62-b2d893d40ba0-scripts\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.556633 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.556644 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7zfn\" (UniqueName: \"kubernetes.io/projected/f71b609e-9e99-4919-ac62-b2d893d40ba0-kube-api-access-c7zfn\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.556654 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71b609e-9e99-4919-ac62-b2d893d40ba0-config-data\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.556667 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.570512 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.580543 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.657556 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:41 crc kubenswrapper[5008]: I1126 23:01:41.657585 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.041016 5008 generic.go:334] "Generic (PLEG): container finished" podID="f71b609e-9e99-4919-ac62-b2d893d40ba0" containerID="8d9b48d20914c15be4ed1fb34f429ed1da59bb87fdbbd68c446d6529b39d463a" exitCode=143
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.041275 5008 generic.go:334] "Generic (PLEG): container finished" podID="f71b609e-9e99-4919-ac62-b2d893d40ba0" containerID="6e3f90834215de431ca2507f6a56379b778210ba3f24388f64419c007c291d83" exitCode=143
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.041064 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.041080 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"f71b609e-9e99-4919-ac62-b2d893d40ba0","Type":"ContainerDied","Data":"8d9b48d20914c15be4ed1fb34f429ed1da59bb87fdbbd68c446d6529b39d463a"}
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.041341 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"f71b609e-9e99-4919-ac62-b2d893d40ba0","Type":"ContainerDied","Data":"6e3f90834215de431ca2507f6a56379b778210ba3f24388f64419c007c291d83"}
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.041357 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"f71b609e-9e99-4919-ac62-b2d893d40ba0","Type":"ContainerDied","Data":"dc54a31a0f868c0f5364f33bb3748c1260473fa2580f8129e9afa46b7a2bfbc6"}
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.041374 5008 scope.go:117] "RemoveContainer" containerID="8d9b48d20914c15be4ed1fb34f429ed1da59bb87fdbbd68c446d6529b39d463a"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.058782 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"]
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.066779 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"]
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.069914 5008 scope.go:117] "RemoveContainer" containerID="6e3f90834215de431ca2507f6a56379b778210ba3f24388f64419c007c291d83"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.083346 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"]
Nov 26 23:01:42 crc kubenswrapper[5008]: E1126 23:01:42.083871 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f71b609e-9e99-4919-ac62-b2d893d40ba0" containerName="glance-log"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.083958 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71b609e-9e99-4919-ac62-b2d893d40ba0" containerName="glance-log"
Nov 26 23:01:42 crc kubenswrapper[5008]: E1126 23:01:42.084077 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f71b609e-9e99-4919-ac62-b2d893d40ba0" containerName="glance-httpd"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.084187 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71b609e-9e99-4919-ac62-b2d893d40ba0" containerName="glance-httpd"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.085386 5008 scope.go:117] "RemoveContainer" containerID="8d9b48d20914c15be4ed1fb34f429ed1da59bb87fdbbd68c446d6529b39d463a"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.085591 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f71b609e-9e99-4919-ac62-b2d893d40ba0" containerName="glance-log"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.085723 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f71b609e-9e99-4919-ac62-b2d893d40ba0" containerName="glance-httpd"
Nov 26 23:01:42 crc kubenswrapper[5008]: E1126 23:01:42.085907 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d9b48d20914c15be4ed1fb34f429ed1da59bb87fdbbd68c446d6529b39d463a\": container with ID starting with 8d9b48d20914c15be4ed1fb34f429ed1da59bb87fdbbd68c446d6529b39d463a not found: ID does not exist" containerID="8d9b48d20914c15be4ed1fb34f429ed1da59bb87fdbbd68c446d6529b39d463a"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.085941 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d9b48d20914c15be4ed1fb34f429ed1da59bb87fdbbd68c446d6529b39d463a"} err="failed to get container status \"8d9b48d20914c15be4ed1fb34f429ed1da59bb87fdbbd68c446d6529b39d463a\": rpc error: code = NotFound desc = could not find container \"8d9b48d20914c15be4ed1fb34f429ed1da59bb87fdbbd68c446d6529b39d463a\": container with ID starting with 8d9b48d20914c15be4ed1fb34f429ed1da59bb87fdbbd68c446d6529b39d463a not found: ID does not exist"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.085978 5008 scope.go:117] "RemoveContainer" containerID="6e3f90834215de431ca2507f6a56379b778210ba3f24388f64419c007c291d83"
Nov 26 23:01:42 crc kubenswrapper[5008]: E1126 23:01:42.086322 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e3f90834215de431ca2507f6a56379b778210ba3f24388f64419c007c291d83\": container with ID starting with 6e3f90834215de431ca2507f6a56379b778210ba3f24388f64419c007c291d83 not found: ID does not exist" containerID="6e3f90834215de431ca2507f6a56379b778210ba3f24388f64419c007c291d83"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.086347 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e3f90834215de431ca2507f6a56379b778210ba3f24388f64419c007c291d83"} err="failed to get container status \"6e3f90834215de431ca2507f6a56379b778210ba3f24388f64419c007c291d83\": rpc error: code = NotFound desc = could not find container \"6e3f90834215de431ca2507f6a56379b778210ba3f24388f64419c007c291d83\": container with ID starting with 6e3f90834215de431ca2507f6a56379b778210ba3f24388f64419c007c291d83 not found: ID does not exist"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.086360 5008 scope.go:117] "RemoveContainer" containerID="8d9b48d20914c15be4ed1fb34f429ed1da59bb87fdbbd68c446d6529b39d463a"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.086708 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d9b48d20914c15be4ed1fb34f429ed1da59bb87fdbbd68c446d6529b39d463a"} err="failed to get container status \"8d9b48d20914c15be4ed1fb34f429ed1da59bb87fdbbd68c446d6529b39d463a\": rpc error: code = NotFound desc = could not find container \"8d9b48d20914c15be4ed1fb34f429ed1da59bb87fdbbd68c446d6529b39d463a\": container with ID starting with 8d9b48d20914c15be4ed1fb34f429ed1da59bb87fdbbd68c446d6529b39d463a not found: ID does not exist"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.086747 5008 scope.go:117] "RemoveContainer" containerID="6e3f90834215de431ca2507f6a56379b778210ba3f24388f64419c007c291d83"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.087249 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.089491 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e3f90834215de431ca2507f6a56379b778210ba3f24388f64419c007c291d83"} err="failed to get container status \"6e3f90834215de431ca2507f6a56379b778210ba3f24388f64419c007c291d83\": rpc error: code = NotFound desc = could not find container \"6e3f90834215de431ca2507f6a56379b778210ba3f24388f64419c007c291d83\": container with ID starting with 6e3f90834215de431ca2507f6a56379b778210ba3f24388f64419c007c291d83 not found: ID does not exist"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.102355 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"]
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.164487 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.164571 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-run\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.164592 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.164635 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-sys\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.164650 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqtg9\" (UniqueName: \"kubernetes.io/projected/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-kube-api-access-mqtg9\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.164671 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.164687 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.164703 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-scripts\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.164739 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-logs\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.164756 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-config-data\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.164791 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.164808 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.164827 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-dev\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.164869 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.265807 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-dev\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266105 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266129 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266159 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-run\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266176 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266211 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-sys\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266228 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqtg9\" (UniqueName: \"kubernetes.io/projected/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-kube-api-access-mqtg9\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266247 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266276 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266290 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-scripts\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266310 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266319 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-logs\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266403 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-config-data\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266396 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-run\") pod 
\"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266470 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.265891 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-dev\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266516 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266541 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266572 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-lib-modules\") pod \"glance-default-internal-api-1\" (UID: 
\"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266625 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266496 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-sys\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266737 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.266738 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.267096 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-logs\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.267138 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.275545 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-config-data\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.275841 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-scripts\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.284898 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqtg9\" (UniqueName: \"kubernetes.io/projected/97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182-kube-api-access-mqtg9\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.305772 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc 
kubenswrapper[5008]: I1126 23:01:42.316155 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-1\" (UID: \"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.412603 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:42 crc kubenswrapper[5008]: I1126 23:01:42.702834 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Nov 26 23:01:43 crc kubenswrapper[5008]: I1126 23:01:43.057753 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182","Type":"ContainerStarted","Data":"4c15fb5e3b8b972a888b7be85c38bb8c05045b8c7bcc1506d3810ab0a46bb130"} Nov 26 23:01:43 crc kubenswrapper[5008]: I1126 23:01:43.058152 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182","Type":"ContainerStarted","Data":"06d2cdf37f15cc79f3f0595a20e4b35f939988f099facf310da426d7055ec72a"} Nov 26 23:01:43 crc kubenswrapper[5008]: I1126 23:01:43.536247 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f71b609e-9e99-4919-ac62-b2d893d40ba0" path="/var/lib/kubelet/pods/f71b609e-9e99-4919-ac62-b2d893d40ba0/volumes" Nov 26 23:01:44 crc kubenswrapper[5008]: I1126 23:01:44.071847 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"97b4a01e-4b3c-4fb4-9aa1-1feb9d13e182","Type":"ContainerStarted","Data":"25a553e7cc9ca7213dc7219f763472d8b8261f108c4c8ac5444feba74bfedf07"} Nov 26 23:01:49 crc kubenswrapper[5008]: I1126 
23:01:49.662414 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:49 crc kubenswrapper[5008]: I1126 23:01:49.663251 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:49 crc kubenswrapper[5008]: I1126 23:01:49.712842 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:49 crc kubenswrapper[5008]: I1126 23:01:49.730432 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:49 crc kubenswrapper[5008]: I1126 23:01:49.739417 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:49 crc kubenswrapper[5008]: I1126 23:01:49.739469 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:49 crc kubenswrapper[5008]: I1126 23:01:49.761778 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=7.761760317 podStartE2EDuration="7.761760317s" podCreationTimestamp="2025-11-26 23:01:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:01:44.107873836 +0000 UTC m=+1379.520567898" watchObservedRunningTime="2025-11-26 23:01:49.761760317 +0000 UTC m=+1385.174454329" Nov 26 23:01:49 crc kubenswrapper[5008]: I1126 23:01:49.795525 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:49 crc kubenswrapper[5008]: I1126 23:01:49.807812 5008 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:49 crc kubenswrapper[5008]: I1126 23:01:49.835185 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:49 crc kubenswrapper[5008]: I1126 23:01:49.835249 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:49 crc kubenswrapper[5008]: I1126 23:01:49.875936 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:49 crc kubenswrapper[5008]: I1126 23:01:49.894215 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:50 crc kubenswrapper[5008]: I1126 23:01:50.157580 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:50 crc kubenswrapper[5008]: I1126 23:01:50.157654 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:50 crc kubenswrapper[5008]: I1126 23:01:50.157683 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:50 crc kubenswrapper[5008]: I1126 23:01:50.157706 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:50 crc kubenswrapper[5008]: I1126 23:01:50.157730 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:50 crc kubenswrapper[5008]: I1126 23:01:50.157752 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:51 
crc kubenswrapper[5008]: I1126 23:01:51.740378 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 26 23:01:51 crc kubenswrapper[5008]: I1126 23:01:51.741821 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 23:01:51 crc kubenswrapper[5008]: I1126 23:01:51.744755 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 26 23:01:51 crc kubenswrapper[5008]: I1126 23:01:51.744925 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 26 23:01:51 crc kubenswrapper[5008]: I1126 23:01:51.753939 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 26 23:01:51 crc kubenswrapper[5008]: I1126 23:01:51.851205 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d204354-ae15-4620-9fc5-0f36001a18a3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6d204354-ae15-4620-9fc5-0f36001a18a3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 23:01:51 crc kubenswrapper[5008]: I1126 23:01:51.851307 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d204354-ae15-4620-9fc5-0f36001a18a3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6d204354-ae15-4620-9fc5-0f36001a18a3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 23:01:51 crc kubenswrapper[5008]: I1126 23:01:51.864318 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:51 crc kubenswrapper[5008]: I1126 23:01:51.885703 5008 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:51 crc kubenswrapper[5008]: I1126 23:01:51.953623 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d204354-ae15-4620-9fc5-0f36001a18a3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6d204354-ae15-4620-9fc5-0f36001a18a3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 23:01:51 crc kubenswrapper[5008]: I1126 23:01:51.953739 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d204354-ae15-4620-9fc5-0f36001a18a3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6d204354-ae15-4620-9fc5-0f36001a18a3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 23:01:51 crc kubenswrapper[5008]: I1126 23:01:51.953915 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d204354-ae15-4620-9fc5-0f36001a18a3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6d204354-ae15-4620-9fc5-0f36001a18a3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 23:01:51 crc kubenswrapper[5008]: I1126 23:01:51.970719 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:51 crc kubenswrapper[5008]: I1126 23:01:51.983893 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d204354-ae15-4620-9fc5-0f36001a18a3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6d204354-ae15-4620-9fc5-0f36001a18a3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 23:01:52 crc kubenswrapper[5008]: I1126 23:01:52.044042 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:52 crc kubenswrapper[5008]: I1126 23:01:52.074571 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 23:01:52 crc kubenswrapper[5008]: I1126 23:01:52.090666 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:52 crc kubenswrapper[5008]: I1126 23:01:52.091307 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 26 23:01:52 crc kubenswrapper[5008]: I1126 23:01:52.178161 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 23:01:52 crc kubenswrapper[5008]: I1126 23:01:52.413414 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:52 crc kubenswrapper[5008]: I1126 23:01:52.413461 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:52 crc kubenswrapper[5008]: I1126 23:01:52.447392 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:52 crc kubenswrapper[5008]: I1126 23:01:52.467979 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:52 crc kubenswrapper[5008]: I1126 23:01:52.532420 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 26 23:01:53 crc kubenswrapper[5008]: I1126 23:01:53.198647 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="955b831b-9624-46ff-9d27-38f67da8ff96" 
containerName="glance-log" containerID="cri-o://8045232b0cb8607598172523ef064ffc35a3c549ef23cba8f69af565d7556fb9" gracePeriod=30 Nov 26 23:01:53 crc kubenswrapper[5008]: I1126 23:01:53.199768 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6d204354-ae15-4620-9fc5-0f36001a18a3","Type":"ContainerStarted","Data":"d81d55f3c636e05a87d866af3bee86570bae68a5b7638fe393637b4eea35e321"} Nov 26 23:01:53 crc kubenswrapper[5008]: I1126 23:01:53.199798 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6d204354-ae15-4620-9fc5-0f36001a18a3","Type":"ContainerStarted","Data":"1c6d0867a23434a9ec9970307cf95e92b328f92bd06df2bc160d16251bbe115b"} Nov 26 23:01:53 crc kubenswrapper[5008]: I1126 23:01:53.201076 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="955b831b-9624-46ff-9d27-38f67da8ff96" containerName="glance-httpd" containerID="cri-o://825ed604c3d9038894792d863cf725cf5670138f5bd356628a520460334bae06" gracePeriod=30 Nov 26 23:01:53 crc kubenswrapper[5008]: I1126 23:01:53.201692 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:53 crc kubenswrapper[5008]: I1126 23:01:53.201716 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:53 crc kubenswrapper[5008]: I1126 23:01:53.207565 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="955b831b-9624-46ff-9d27-38f67da8ff96" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.149:9292/healthcheck\": EOF" Nov 26 23:01:53 crc kubenswrapper[5008]: I1126 23:01:53.209631 5008 prober.go:107] "Probe failed" probeType="Readiness" 
pod="glance-kuttl-tests/glance-default-external-api-0" podUID="955b831b-9624-46ff-9d27-38f67da8ff96" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.149:9292/healthcheck\": EOF" Nov 26 23:01:53 crc kubenswrapper[5008]: I1126 23:01:53.230879 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.230832308 podStartE2EDuration="2.230832308s" podCreationTimestamp="2025-11-26 23:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:01:53.221812004 +0000 UTC m=+1388.634506016" watchObservedRunningTime="2025-11-26 23:01:53.230832308 +0000 UTC m=+1388.643526330" Nov 26 23:01:54 crc kubenswrapper[5008]: I1126 23:01:54.215519 5008 generic.go:334] "Generic (PLEG): container finished" podID="6d204354-ae15-4620-9fc5-0f36001a18a3" containerID="d81d55f3c636e05a87d866af3bee86570bae68a5b7638fe393637b4eea35e321" exitCode=0 Nov 26 23:01:54 crc kubenswrapper[5008]: I1126 23:01:54.215619 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6d204354-ae15-4620-9fc5-0f36001a18a3","Type":"ContainerDied","Data":"d81d55f3c636e05a87d866af3bee86570bae68a5b7638fe393637b4eea35e321"} Nov 26 23:01:54 crc kubenswrapper[5008]: I1126 23:01:54.219843 5008 generic.go:334] "Generic (PLEG): container finished" podID="955b831b-9624-46ff-9d27-38f67da8ff96" containerID="8045232b0cb8607598172523ef064ffc35a3c549ef23cba8f69af565d7556fb9" exitCode=143 Nov 26 23:01:54 crc kubenswrapper[5008]: I1126 23:01:54.219946 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"955b831b-9624-46ff-9d27-38f67da8ff96","Type":"ContainerDied","Data":"8045232b0cb8607598172523ef064ffc35a3c549ef23cba8f69af565d7556fb9"} Nov 26 23:01:55 crc kubenswrapper[5008]: I1126 
23:01:55.047225 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:55 crc kubenswrapper[5008]: I1126 23:01:55.051205 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Nov 26 23:01:55 crc kubenswrapper[5008]: I1126 23:01:55.146199 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 23:01:55 crc kubenswrapper[5008]: I1126 23:01:55.146539 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="1705cbdd-f302-404d-9f4b-535a49355777" containerName="glance-log" containerID="cri-o://326f949b42bf51c98cc6be3b5f9041f85f5ea619e74c4362e7f5d00dccfa3f9c" gracePeriod=30 Nov 26 23:01:55 crc kubenswrapper[5008]: I1126 23:01:55.146771 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="1705cbdd-f302-404d-9f4b-535a49355777" containerName="glance-httpd" containerID="cri-o://4731f5e1e8e9f1b276c6aa234a1839447ae9f1dd36b29d01e65647efce4af484" gracePeriod=30 Nov 26 23:01:55 crc kubenswrapper[5008]: I1126 23:01:55.554535 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 23:01:55 crc kubenswrapper[5008]: I1126 23:01:55.644691 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d204354-ae15-4620-9fc5-0f36001a18a3-kube-api-access\") pod \"6d204354-ae15-4620-9fc5-0f36001a18a3\" (UID: \"6d204354-ae15-4620-9fc5-0f36001a18a3\") " Nov 26 23:01:55 crc kubenswrapper[5008]: I1126 23:01:55.644778 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d204354-ae15-4620-9fc5-0f36001a18a3-kubelet-dir\") pod \"6d204354-ae15-4620-9fc5-0f36001a18a3\" (UID: \"6d204354-ae15-4620-9fc5-0f36001a18a3\") " Nov 26 23:01:55 crc kubenswrapper[5008]: I1126 23:01:55.645379 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d204354-ae15-4620-9fc5-0f36001a18a3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6d204354-ae15-4620-9fc5-0f36001a18a3" (UID: "6d204354-ae15-4620-9fc5-0f36001a18a3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:55 crc kubenswrapper[5008]: I1126 23:01:55.651291 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d204354-ae15-4620-9fc5-0f36001a18a3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6d204354-ae15-4620-9fc5-0f36001a18a3" (UID: "6d204354-ae15-4620-9fc5-0f36001a18a3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:01:55 crc kubenswrapper[5008]: I1126 23:01:55.747489 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d204354-ae15-4620-9fc5-0f36001a18a3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:55 crc kubenswrapper[5008]: I1126 23:01:55.747544 5008 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d204354-ae15-4620-9fc5-0f36001a18a3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:56 crc kubenswrapper[5008]: I1126 23:01:56.235253 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 23:01:56 crc kubenswrapper[5008]: I1126 23:01:56.235327 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6d204354-ae15-4620-9fc5-0f36001a18a3","Type":"ContainerDied","Data":"1c6d0867a23434a9ec9970307cf95e92b328f92bd06df2bc160d16251bbe115b"} Nov 26 23:01:56 crc kubenswrapper[5008]: I1126 23:01:56.236177 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c6d0867a23434a9ec9970307cf95e92b328f92bd06df2bc160d16251bbe115b" Nov 26 23:01:56 crc kubenswrapper[5008]: I1126 23:01:56.245587 5008 generic.go:334] "Generic (PLEG): container finished" podID="1705cbdd-f302-404d-9f4b-535a49355777" containerID="326f949b42bf51c98cc6be3b5f9041f85f5ea619e74c4362e7f5d00dccfa3f9c" exitCode=143 Nov 26 23:01:56 crc kubenswrapper[5008]: I1126 23:01:56.245699 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"1705cbdd-f302-404d-9f4b-535a49355777","Type":"ContainerDied","Data":"326f949b42bf51c98cc6be3b5f9041f85f5ea619e74c4362e7f5d00dccfa3f9c"} Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.107082 5008 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.191043 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/955b831b-9624-46ff-9d27-38f67da8ff96-logs\") pod \"955b831b-9624-46ff-9d27-38f67da8ff96\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.191113 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/955b831b-9624-46ff-9d27-38f67da8ff96-config-data\") pod \"955b831b-9624-46ff-9d27-38f67da8ff96\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.191143 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/955b831b-9624-46ff-9d27-38f67da8ff96-httpd-run\") pod \"955b831b-9624-46ff-9d27-38f67da8ff96\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.191171 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-lib-modules\") pod \"955b831b-9624-46ff-9d27-38f67da8ff96\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.191192 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-etc-iscsi\") pod \"955b831b-9624-46ff-9d27-38f67da8ff96\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.191214 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-sys\") pod \"955b831b-9624-46ff-9d27-38f67da8ff96\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.191240 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-etc-nvme\") pod \"955b831b-9624-46ff-9d27-38f67da8ff96\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.191312 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cffs4\" (UniqueName: \"kubernetes.io/projected/955b831b-9624-46ff-9d27-38f67da8ff96-kube-api-access-cffs4\") pod \"955b831b-9624-46ff-9d27-38f67da8ff96\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.191337 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "955b831b-9624-46ff-9d27-38f67da8ff96" (UID: "955b831b-9624-46ff-9d27-38f67da8ff96"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.191395 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-dev\") pod \"955b831b-9624-46ff-9d27-38f67da8ff96\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.191409 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-sys" (OuterVolumeSpecName: "sys") pod "955b831b-9624-46ff-9d27-38f67da8ff96" (UID: "955b831b-9624-46ff-9d27-38f67da8ff96"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.191414 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "955b831b-9624-46ff-9d27-38f67da8ff96" (UID: "955b831b-9624-46ff-9d27-38f67da8ff96"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.191430 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"955b831b-9624-46ff-9d27-38f67da8ff96\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.191516 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-var-locks-brick\") pod \"955b831b-9624-46ff-9d27-38f67da8ff96\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.191577 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-run\") pod \"955b831b-9624-46ff-9d27-38f67da8ff96\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.191617 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"955b831b-9624-46ff-9d27-38f67da8ff96\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.191651 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/955b831b-9624-46ff-9d27-38f67da8ff96-scripts\") pod \"955b831b-9624-46ff-9d27-38f67da8ff96\" (UID: \"955b831b-9624-46ff-9d27-38f67da8ff96\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.191930 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-dev" (OuterVolumeSpecName: "dev") pod "955b831b-9624-46ff-9d27-38f67da8ff96" (UID: "955b831b-9624-46ff-9d27-38f67da8ff96"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.192057 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-run" (OuterVolumeSpecName: "run") pod "955b831b-9624-46ff-9d27-38f67da8ff96" (UID: "955b831b-9624-46ff-9d27-38f67da8ff96"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.192108 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.192126 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "955b831b-9624-46ff-9d27-38f67da8ff96" (UID: "955b831b-9624-46ff-9d27-38f67da8ff96"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.192131 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.192170 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-sys\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.192185 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-dev\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.192214 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "955b831b-9624-46ff-9d27-38f67da8ff96" (UID: "955b831b-9624-46ff-9d27-38f67da8ff96"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.192290 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/955b831b-9624-46ff-9d27-38f67da8ff96-logs" (OuterVolumeSpecName: "logs") pod "955b831b-9624-46ff-9d27-38f67da8ff96" (UID: "955b831b-9624-46ff-9d27-38f67da8ff96"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.192495 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/955b831b-9624-46ff-9d27-38f67da8ff96-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "955b831b-9624-46ff-9d27-38f67da8ff96" (UID: "955b831b-9624-46ff-9d27-38f67da8ff96"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.197150 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "glance-cache") pod "955b831b-9624-46ff-9d27-38f67da8ff96" (UID: "955b831b-9624-46ff-9d27-38f67da8ff96"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.197926 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "955b831b-9624-46ff-9d27-38f67da8ff96" (UID: "955b831b-9624-46ff-9d27-38f67da8ff96"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.201218 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/955b831b-9624-46ff-9d27-38f67da8ff96-kube-api-access-cffs4" (OuterVolumeSpecName: "kube-api-access-cffs4") pod "955b831b-9624-46ff-9d27-38f67da8ff96" (UID: "955b831b-9624-46ff-9d27-38f67da8ff96"). InnerVolumeSpecName "kube-api-access-cffs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.202099 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/955b831b-9624-46ff-9d27-38f67da8ff96-scripts" (OuterVolumeSpecName: "scripts") pod "955b831b-9624-46ff-9d27-38f67da8ff96" (UID: "955b831b-9624-46ff-9d27-38f67da8ff96"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.244716 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/955b831b-9624-46ff-9d27-38f67da8ff96-config-data" (OuterVolumeSpecName: "config-data") pod "955b831b-9624-46ff-9d27-38f67da8ff96" (UID: "955b831b-9624-46ff-9d27-38f67da8ff96"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.266482 5008 generic.go:334] "Generic (PLEG): container finished" podID="955b831b-9624-46ff-9d27-38f67da8ff96" containerID="825ed604c3d9038894792d863cf725cf5670138f5bd356628a520460334bae06" exitCode=0 Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.266538 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.266548 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"955b831b-9624-46ff-9d27-38f67da8ff96","Type":"ContainerDied","Data":"825ed604c3d9038894792d863cf725cf5670138f5bd356628a520460334bae06"} Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.266600 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"955b831b-9624-46ff-9d27-38f67da8ff96","Type":"ContainerDied","Data":"a4a29d4a453d9fe6ea615681401ed8acff0ccca7b25b250ffe3d18b57e5b500f"} Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.266630 5008 scope.go:117] "RemoveContainer" containerID="825ed604c3d9038894792d863cf725cf5670138f5bd356628a520460334bae06" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.297767 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.297914 
5008 scope.go:117] "RemoveContainer" containerID="8045232b0cb8607598172523ef064ffc35a3c549ef23cba8f69af565d7556fb9" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.300332 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/955b831b-9624-46ff-9d27-38f67da8ff96-logs\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.300375 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/955b831b-9624-46ff-9d27-38f67da8ff96-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.300398 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/955b831b-9624-46ff-9d27-38f67da8ff96-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.300418 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.300435 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cffs4\" (UniqueName: \"kubernetes.io/projected/955b831b-9624-46ff-9d27-38f67da8ff96-kube-api-access-cffs4\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.300474 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.300494 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 
23:01:58.300511 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/955b831b-9624-46ff-9d27-38f67da8ff96-run\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.300536 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.300555 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/955b831b-9624-46ff-9d27-38f67da8ff96-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.303874 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.313584 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.320449 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.323554 5008 scope.go:117] "RemoveContainer" containerID="825ed604c3d9038894792d863cf725cf5670138f5bd356628a520460334bae06" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.335194 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 23:01:58 crc kubenswrapper[5008]: E1126 23:01:58.335481 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="955b831b-9624-46ff-9d27-38f67da8ff96" containerName="glance-log" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.335493 5008 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="955b831b-9624-46ff-9d27-38f67da8ff96" containerName="glance-log" Nov 26 23:01:58 crc kubenswrapper[5008]: E1126 23:01:58.335509 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="955b831b-9624-46ff-9d27-38f67da8ff96" containerName="glance-httpd" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.335515 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="955b831b-9624-46ff-9d27-38f67da8ff96" containerName="glance-httpd" Nov 26 23:01:58 crc kubenswrapper[5008]: E1126 23:01:58.335627 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"825ed604c3d9038894792d863cf725cf5670138f5bd356628a520460334bae06\": container with ID starting with 825ed604c3d9038894792d863cf725cf5670138f5bd356628a520460334bae06 not found: ID does not exist" containerID="825ed604c3d9038894792d863cf725cf5670138f5bd356628a520460334bae06" Nov 26 23:01:58 crc kubenswrapper[5008]: E1126 23:01:58.335525 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d204354-ae15-4620-9fc5-0f36001a18a3" containerName="pruner" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.336810 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d204354-ae15-4620-9fc5-0f36001a18a3" containerName="pruner" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.337159 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d204354-ae15-4620-9fc5-0f36001a18a3" containerName="pruner" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.337183 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="955b831b-9624-46ff-9d27-38f67da8ff96" containerName="glance-log" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.337195 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="955b831b-9624-46ff-9d27-38f67da8ff96" containerName="glance-httpd" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.338226 5008 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.335655 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"825ed604c3d9038894792d863cf725cf5670138f5bd356628a520460334bae06"} err="failed to get container status \"825ed604c3d9038894792d863cf725cf5670138f5bd356628a520460334bae06\": rpc error: code = NotFound desc = could not find container \"825ed604c3d9038894792d863cf725cf5670138f5bd356628a520460334bae06\": container with ID starting with 825ed604c3d9038894792d863cf725cf5670138f5bd356628a520460334bae06 not found: ID does not exist" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.338702 5008 scope.go:117] "RemoveContainer" containerID="8045232b0cb8607598172523ef064ffc35a3c549ef23cba8f69af565d7556fb9" Nov 26 23:01:58 crc kubenswrapper[5008]: E1126 23:01:58.339583 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8045232b0cb8607598172523ef064ffc35a3c549ef23cba8f69af565d7556fb9\": container with ID starting with 8045232b0cb8607598172523ef064ffc35a3c549ef23cba8f69af565d7556fb9 not found: ID does not exist" containerID="8045232b0cb8607598172523ef064ffc35a3c549ef23cba8f69af565d7556fb9" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.339631 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8045232b0cb8607598172523ef064ffc35a3c549ef23cba8f69af565d7556fb9"} err="failed to get container status \"8045232b0cb8607598172523ef064ffc35a3c549ef23cba8f69af565d7556fb9\": rpc error: code = NotFound desc = could not find container \"8045232b0cb8607598172523ef064ffc35a3c549ef23cba8f69af565d7556fb9\": container with ID starting with 8045232b0cb8607598172523ef064ffc35a3c549ef23cba8f69af565d7556fb9 not found: ID does not exist" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 
23:01:58.351304 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.402146 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46231fae-475b-4b4c-ac47-c28c899a2403-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.402715 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46231fae-475b-4b4c-ac47-c28c899a2403-logs\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.402740 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-sys\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.402776 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.402804 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-run\") pod 
\"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.402839 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46231fae-475b-4b4c-ac47-c28c899a2403-config-data\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.402866 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.402910 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.402940 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-442w8\" (UniqueName: \"kubernetes.io/projected/46231fae-475b-4b4c-ac47-c28c899a2403-kube-api-access-442w8\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.402980 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.403259 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.403353 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.403402 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-dev\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.403436 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.403508 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.403716 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46231fae-475b-4b4c-ac47-c28c899a2403-scripts\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.424406 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.429057 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.505293 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46231fae-475b-4b4c-ac47-c28c899a2403-config-data\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.505356 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.505380 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-442w8\" (UniqueName: \"kubernetes.io/projected/46231fae-475b-4b4c-ac47-c28c899a2403-kube-api-access-442w8\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.505405 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.505427 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.505443 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-dev\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.505461 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/46231fae-475b-4b4c-ac47-c28c899a2403-scripts\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.505485 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.505507 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46231fae-475b-4b4c-ac47-c28c899a2403-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.505543 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-dev\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.505548 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-sys\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.505582 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-sys\") pod 
\"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.505597 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46231fae-475b-4b4c-ac47-c28c899a2403-logs\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.505692 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.505691 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.505726 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-run\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.505755 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-run\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.505847 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.505921 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/46231fae-475b-4b4c-ac47-c28c899a2403-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.506648 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46231fae-475b-4b4c-ac47-c28c899a2403-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.506732 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46231fae-475b-4b4c-ac47-c28c899a2403-logs\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.509654 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46231fae-475b-4b4c-ac47-c28c899a2403-scripts\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc 
kubenswrapper[5008]: I1126 23:01:58.519584 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46231fae-475b-4b4c-ac47-c28c899a2403-config-data\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.522120 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-442w8\" (UniqueName: \"kubernetes.io/projected/46231fae-475b-4b4c-ac47-c28c899a2403-kube-api-access-442w8\") pod \"glance-default-external-api-0\" (UID: \"46231fae-475b-4b4c-ac47-c28c899a2403\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.576349 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.607393 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-run\") pod \"1705cbdd-f302-404d-9f4b-535a49355777\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.607480 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-dev\") pod \"1705cbdd-f302-404d-9f4b-535a49355777\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.607572 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1705cbdd-f302-404d-9f4b-535a49355777-httpd-run\") pod \"1705cbdd-f302-404d-9f4b-535a49355777\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " Nov 26 23:01:58 crc 
kubenswrapper[5008]: I1126 23:01:58.607638 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1705cbdd-f302-404d-9f4b-535a49355777-scripts\") pod \"1705cbdd-f302-404d-9f4b-535a49355777\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.607705 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-lib-modules\") pod \"1705cbdd-f302-404d-9f4b-535a49355777\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.607777 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-var-locks-brick\") pod \"1705cbdd-f302-404d-9f4b-535a49355777\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.608086 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "1705cbdd-f302-404d-9f4b-535a49355777" (UID: "1705cbdd-f302-404d-9f4b-535a49355777"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.608171 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"1705cbdd-f302-404d-9f4b-535a49355777\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.608119 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-run" (OuterVolumeSpecName: "run") pod "1705cbdd-f302-404d-9f4b-535a49355777" (UID: "1705cbdd-f302-404d-9f4b-535a49355777"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.608143 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "1705cbdd-f302-404d-9f4b-535a49355777" (UID: "1705cbdd-f302-404d-9f4b-535a49355777"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.608162 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-dev" (OuterVolumeSpecName: "dev") pod "1705cbdd-f302-404d-9f4b-535a49355777" (UID: "1705cbdd-f302-404d-9f4b-535a49355777"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.608244 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-etc-iscsi\") pod \"1705cbdd-f302-404d-9f4b-535a49355777\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.608291 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1705cbdd-f302-404d-9f4b-535a49355777-logs\") pod \"1705cbdd-f302-404d-9f4b-535a49355777\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.608353 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5l4w\" (UniqueName: \"kubernetes.io/projected/1705cbdd-f302-404d-9f4b-535a49355777-kube-api-access-v5l4w\") pod \"1705cbdd-f302-404d-9f4b-535a49355777\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.608399 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1705cbdd-f302-404d-9f4b-535a49355777-config-data\") pod \"1705cbdd-f302-404d-9f4b-535a49355777\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.608454 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"1705cbdd-f302-404d-9f4b-535a49355777\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.608500 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-sys\") pod \"1705cbdd-f302-404d-9f4b-535a49355777\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.608550 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-etc-nvme\") pod \"1705cbdd-f302-404d-9f4b-535a49355777\" (UID: \"1705cbdd-f302-404d-9f4b-535a49355777\") " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.608846 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1705cbdd-f302-404d-9f4b-535a49355777-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1705cbdd-f302-404d-9f4b-535a49355777" (UID: "1705cbdd-f302-404d-9f4b-535a49355777"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.609152 5008 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-run\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.609180 5008 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-dev\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.609199 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1705cbdd-f302-404d-9f4b-535a49355777-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.609217 5008 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 
23:01:58.609236 5008 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.609603 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "1705cbdd-f302-404d-9f4b-535a49355777" (UID: "1705cbdd-f302-404d-9f4b-535a49355777"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.610152 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1705cbdd-f302-404d-9f4b-535a49355777-logs" (OuterVolumeSpecName: "logs") pod "1705cbdd-f302-404d-9f4b-535a49355777" (UID: "1705cbdd-f302-404d-9f4b-535a49355777"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.610250 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-sys" (OuterVolumeSpecName: "sys") pod "1705cbdd-f302-404d-9f4b-535a49355777" (UID: "1705cbdd-f302-404d-9f4b-535a49355777"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.610737 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "1705cbdd-f302-404d-9f4b-535a49355777" (UID: "1705cbdd-f302-404d-9f4b-535a49355777"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.611348 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "1705cbdd-f302-404d-9f4b-535a49355777" (UID: "1705cbdd-f302-404d-9f4b-535a49355777"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.611672 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1705cbdd-f302-404d-9f4b-535a49355777-kube-api-access-v5l4w" (OuterVolumeSpecName: "kube-api-access-v5l4w") pod "1705cbdd-f302-404d-9f4b-535a49355777" (UID: "1705cbdd-f302-404d-9f4b-535a49355777"). InnerVolumeSpecName "kube-api-access-v5l4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.612575 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1705cbdd-f302-404d-9f4b-535a49355777-scripts" (OuterVolumeSpecName: "scripts") pod "1705cbdd-f302-404d-9f4b-535a49355777" (UID: "1705cbdd-f302-404d-9f4b-535a49355777"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.615359 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance-cache") pod "1705cbdd-f302-404d-9f4b-535a49355777" (UID: "1705cbdd-f302-404d-9f4b-535a49355777"). InnerVolumeSpecName "local-storage18-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.662168 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1705cbdd-f302-404d-9f4b-535a49355777-config-data" (OuterVolumeSpecName: "config-data") pod "1705cbdd-f302-404d-9f4b-535a49355777" (UID: "1705cbdd-f302-404d-9f4b-535a49355777"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.685400 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.711532 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1705cbdd-f302-404d-9f4b-535a49355777-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.719102 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.719183 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1705cbdd-f302-404d-9f4b-535a49355777-logs\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.719200 5008 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.719214 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5l4w\" (UniqueName: \"kubernetes.io/projected/1705cbdd-f302-404d-9f4b-535a49355777-kube-api-access-v5l4w\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc 
kubenswrapper[5008]: I1126 23:01:58.719230 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1705cbdd-f302-404d-9f4b-535a49355777-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.719270 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.719284 5008 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-sys\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.719296 5008 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1705cbdd-f302-404d-9f4b-535a49355777-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.732600 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.738554 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.820361 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 23:01:58.820648 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Nov 26 23:01:58 crc kubenswrapper[5008]: I1126 
23:01:58.903741 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.131053 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 26 23:01:59 crc kubenswrapper[5008]: E1126 23:01:59.131452 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1705cbdd-f302-404d-9f4b-535a49355777" containerName="glance-log" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.131472 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1705cbdd-f302-404d-9f4b-535a49355777" containerName="glance-log" Nov 26 23:01:59 crc kubenswrapper[5008]: E1126 23:01:59.131505 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1705cbdd-f302-404d-9f4b-535a49355777" containerName="glance-httpd" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.131514 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1705cbdd-f302-404d-9f4b-535a49355777" containerName="glance-httpd" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.131682 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="1705cbdd-f302-404d-9f4b-535a49355777" containerName="glance-httpd" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.131706 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="1705cbdd-f302-404d-9f4b-535a49355777" containerName="glance-log" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.132266 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.137669 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.138225 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.159037 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.230150 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5-kube-api-access\") pod \"installer-9-crc\" (UID: \"1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.230401 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.230506 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5-var-lock\") pod \"installer-9-crc\" (UID: \"1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.276817 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" 
event={"ID":"46231fae-475b-4b4c-ac47-c28c899a2403","Type":"ContainerStarted","Data":"d6e62c46da85e4714344596f6fd72c58851665c7a25bea2df57c26bc23de6b41"} Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.276878 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"46231fae-475b-4b4c-ac47-c28c899a2403","Type":"ContainerStarted","Data":"bb865a657e12cb679121be10c2a0d64fbd42dbcb430f361f14251cd3d4fdd296"} Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.276901 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"46231fae-475b-4b4c-ac47-c28c899a2403","Type":"ContainerStarted","Data":"5a989d8eb0e05165c3499142e16f7aa3f16dd386db00d470266685152cf3ce76"} Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.284017 5008 generic.go:334] "Generic (PLEG): container finished" podID="1705cbdd-f302-404d-9f4b-535a49355777" containerID="4731f5e1e8e9f1b276c6aa234a1839447ae9f1dd36b29d01e65647efce4af484" exitCode=0 Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.284079 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"1705cbdd-f302-404d-9f4b-535a49355777","Type":"ContainerDied","Data":"4731f5e1e8e9f1b276c6aa234a1839447ae9f1dd36b29d01e65647efce4af484"} Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.284161 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"1705cbdd-f302-404d-9f4b-535a49355777","Type":"ContainerDied","Data":"52cf6262921b5baba1349b526a460e9c3c7232ef52d6034da72098642cf44345"} Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.284195 5008 scope.go:117] "RemoveContainer" containerID="4731f5e1e8e9f1b276c6aa234a1839447ae9f1dd36b29d01e65647efce4af484" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.284084 5008 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.323510 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=1.3234776400000001 podStartE2EDuration="1.32347764s" podCreationTimestamp="2025-11-26 23:01:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:01:59.300292473 +0000 UTC m=+1394.712986505" watchObservedRunningTime="2025-11-26 23:01:59.32347764 +0000 UTC m=+1394.736171672" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.332492 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.332548 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5-var-lock\") pod \"installer-9-crc\" (UID: \"1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.332663 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5-kube-api-access\") pod \"installer-9-crc\" (UID: \"1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.333053 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.333574 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5-var-lock\") pod \"installer-9-crc\" (UID: \"1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.345950 5008 scope.go:117] "RemoveContainer" containerID="326f949b42bf51c98cc6be3b5f9041f85f5ea619e74c4362e7f5d00dccfa3f9c" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.352094 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.362014 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.365988 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5-kube-api-access\") pod \"installer-9-crc\" (UID: \"1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.373375 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.374781 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.376811 5008 scope.go:117] "RemoveContainer" containerID="4731f5e1e8e9f1b276c6aa234a1839447ae9f1dd36b29d01e65647efce4af484" Nov 26 23:01:59 crc kubenswrapper[5008]: E1126 23:01:59.377465 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4731f5e1e8e9f1b276c6aa234a1839447ae9f1dd36b29d01e65647efce4af484\": container with ID starting with 4731f5e1e8e9f1b276c6aa234a1839447ae9f1dd36b29d01e65647efce4af484 not found: ID does not exist" containerID="4731f5e1e8e9f1b276c6aa234a1839447ae9f1dd36b29d01e65647efce4af484" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.377513 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4731f5e1e8e9f1b276c6aa234a1839447ae9f1dd36b29d01e65647efce4af484"} err="failed to get container status \"4731f5e1e8e9f1b276c6aa234a1839447ae9f1dd36b29d01e65647efce4af484\": rpc error: code = NotFound desc = could not find container \"4731f5e1e8e9f1b276c6aa234a1839447ae9f1dd36b29d01e65647efce4af484\": container with ID starting with 4731f5e1e8e9f1b276c6aa234a1839447ae9f1dd36b29d01e65647efce4af484 not found: ID does not exist" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.377543 5008 scope.go:117] "RemoveContainer" containerID="326f949b42bf51c98cc6be3b5f9041f85f5ea619e74c4362e7f5d00dccfa3f9c" Nov 26 23:01:59 crc kubenswrapper[5008]: E1126 23:01:59.378210 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"326f949b42bf51c98cc6be3b5f9041f85f5ea619e74c4362e7f5d00dccfa3f9c\": container with ID starting with 326f949b42bf51c98cc6be3b5f9041f85f5ea619e74c4362e7f5d00dccfa3f9c not found: ID does not exist" containerID="326f949b42bf51c98cc6be3b5f9041f85f5ea619e74c4362e7f5d00dccfa3f9c" Nov 26 23:01:59 crc 
kubenswrapper[5008]: I1126 23:01:59.378240 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"326f949b42bf51c98cc6be3b5f9041f85f5ea619e74c4362e7f5d00dccfa3f9c"} err="failed to get container status \"326f949b42bf51c98cc6be3b5f9041f85f5ea619e74c4362e7f5d00dccfa3f9c\": rpc error: code = NotFound desc = could not find container \"326f949b42bf51c98cc6be3b5f9041f85f5ea619e74c4362e7f5d00dccfa3f9c\": container with ID starting with 326f949b42bf51c98cc6be3b5f9041f85f5ea619e74c4362e7f5d00dccfa3f9c not found: ID does not exist" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.387145 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.493323 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.540936 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.541012 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.541034 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.541054 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-sys\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.541079 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm2lf\" (UniqueName: \"kubernetes.io/projected/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-kube-api-access-rm2lf\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.541123 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.541153 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.541175 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.541209 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.541236 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.541254 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-logs\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.541281 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-dev\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.541313 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.541331 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-run\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.550713 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1705cbdd-f302-404d-9f4b-535a49355777" path="/var/lib/kubelet/pods/1705cbdd-f302-404d-9f4b-535a49355777/volumes" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.551631 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="955b831b-9624-46ff-9d27-38f67da8ff96" path="/var/lib/kubelet/pods/955b831b-9624-46ff-9d27-38f67da8ff96/volumes" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.642827 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.643150 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.643173 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-logs\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.643206 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-dev\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.643200 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.643250 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.643273 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-run\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 
23:01:59.643332 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.643332 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-dev\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.643364 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.643385 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.643407 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-sys\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.643409 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.643434 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm2lf\" (UniqueName: \"kubernetes.io/projected/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-kube-api-access-rm2lf\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.643484 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.643514 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.643548 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.644013 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-sys\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.644212 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.644229 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.644592 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-run\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.644715 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.644735 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-logs\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.644772 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.649108 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") device mount path \"/mnt/openstack/pv18\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.654702 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.655719 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.673009 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm2lf\" (UniqueName: 
\"kubernetes.io/projected/7d02b76c-5d90-4e0c-8cd0-859b71ff342d-kube-api-access-rm2lf\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.677934 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.679132 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d02b76c-5d90-4e0c-8cd0-859b71ff342d\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.717330 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:01:59 crc kubenswrapper[5008]: I1126 23:01:59.981790 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 23:01:59 crc kubenswrapper[5008]: W1126 23:01:59.987021 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d02b76c_5d90_4e0c_8cd0_859b71ff342d.slice/crio-0d691ffb80a1ff6fb4367fb3b07d4f82ab5f4678c474fc6d297b720e531b7ba5 WatchSource:0}: Error finding container 0d691ffb80a1ff6fb4367fb3b07d4f82ab5f4678c474fc6d297b720e531b7ba5: Status 404 returned error can't find the container with id 0d691ffb80a1ff6fb4367fb3b07d4f82ab5f4678c474fc6d297b720e531b7ba5 Nov 26 23:02:00 crc kubenswrapper[5008]: W1126 23:02:00.038569 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1308c6b2_bbac_4fa2_a8cb_7a3028f05ae5.slice/crio-920e37d3564c3756f79d820501d2db4fad657a113058c1f42f8bf9d07303bbe1 WatchSource:0}: Error finding container 920e37d3564c3756f79d820501d2db4fad657a113058c1f42f8bf9d07303bbe1: Status 404 returned error can't find the container with id 920e37d3564c3756f79d820501d2db4fad657a113058c1f42f8bf9d07303bbe1 Nov 26 23:02:00 crc kubenswrapper[5008]: I1126 23:02:00.041213 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 26 23:02:00 crc kubenswrapper[5008]: I1126 23:02:00.296393 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"7d02b76c-5d90-4e0c-8cd0-859b71ff342d","Type":"ContainerStarted","Data":"a6074ff47a8693ce12babc4033ed8f5ba8e5b1ad3b3190048722dd386980b932"} Nov 26 23:02:00 crc kubenswrapper[5008]: I1126 23:02:00.296704 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"7d02b76c-5d90-4e0c-8cd0-859b71ff342d","Type":"ContainerStarted","Data":"0d691ffb80a1ff6fb4367fb3b07d4f82ab5f4678c474fc6d297b720e531b7ba5"} Nov 26 23:02:00 crc kubenswrapper[5008]: I1126 23:02:00.297807 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5","Type":"ContainerStarted","Data":"920e37d3564c3756f79d820501d2db4fad657a113058c1f42f8bf9d07303bbe1"} Nov 26 23:02:01 crc kubenswrapper[5008]: I1126 23:02:01.312485 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"7d02b76c-5d90-4e0c-8cd0-859b71ff342d","Type":"ContainerStarted","Data":"c48ac8742827899ce32d26ee2465c570851b25d3fc589164fedc8163fa19078a"} Nov 26 23:02:01 crc kubenswrapper[5008]: I1126 23:02:01.315010 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5","Type":"ContainerStarted","Data":"958d6619b9c22bce1e3d357a4c2a7ae4d61b22dd626e15c5e32c1dfd6d26d888"} Nov 26 23:02:01 crc kubenswrapper[5008]: I1126 23:02:01.358433 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.358415812 podStartE2EDuration="2.358415812s" podCreationTimestamp="2025-11-26 23:01:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:02:01.350737801 +0000 UTC m=+1396.763431843" watchObservedRunningTime="2025-11-26 23:02:01.358415812 +0000 UTC m=+1396.771109814" Nov 26 23:02:01 crc kubenswrapper[5008]: I1126 23:02:01.376203 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.37617227 podStartE2EDuration="2.37617227s" podCreationTimestamp="2025-11-26 23:01:59 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:02:01.373901429 +0000 UTC m=+1396.786595481" watchObservedRunningTime="2025-11-26 23:02:01.37617227 +0000 UTC m=+1396.788866272" Nov 26 23:02:08 crc kubenswrapper[5008]: I1126 23:02:08.686189 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:02:08 crc kubenswrapper[5008]: I1126 23:02:08.687068 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:02:08 crc kubenswrapper[5008]: I1126 23:02:08.734932 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:02:08 crc kubenswrapper[5008]: I1126 23:02:08.777467 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:02:09 crc kubenswrapper[5008]: I1126 23:02:09.399041 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:02:09 crc kubenswrapper[5008]: I1126 23:02:09.399127 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:02:09 crc kubenswrapper[5008]: I1126 23:02:09.717729 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:02:09 crc kubenswrapper[5008]: I1126 23:02:09.717812 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:02:09 crc kubenswrapper[5008]: I1126 23:02:09.763929 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 
26 23:02:09 crc kubenswrapper[5008]: I1126 23:02:09.789918 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:02:10 crc kubenswrapper[5008]: I1126 23:02:10.408024 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:02:10 crc kubenswrapper[5008]: I1126 23:02:10.408316 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:02:11 crc kubenswrapper[5008]: I1126 23:02:11.225808 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:02:11 crc kubenswrapper[5008]: I1126 23:02:11.229725 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 23:02:12 crc kubenswrapper[5008]: I1126 23:02:12.290972 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:02:12 crc kubenswrapper[5008]: I1126 23:02:12.336492 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.547185 5008 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.549517 5008 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.549731 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.550047 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a" gracePeriod=15 Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.550117 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f" gracePeriod=15 Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.550272 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c" gracePeriod=15 Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.550306 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801" gracePeriod=15 Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.550272 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e" gracePeriod=15 Nov 26 23:02:38 crc 
kubenswrapper[5008]: I1126 23:02:38.550400 5008 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 23:02:38 crc kubenswrapper[5008]: E1126 23:02:38.551027 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.551081 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 26 23:02:38 crc kubenswrapper[5008]: E1126 23:02:38.551130 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.551148 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 26 23:02:38 crc kubenswrapper[5008]: E1126 23:02:38.551171 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.551188 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 26 23:02:38 crc kubenswrapper[5008]: E1126 23:02:38.551213 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.551229 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 26 23:02:38 crc kubenswrapper[5008]: E1126 23:02:38.551258 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 23:02:38 
crc kubenswrapper[5008]: I1126 23:02:38.551275 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 23:02:38 crc kubenswrapper[5008]: E1126 23:02:38.551294 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.551311 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 26 23:02:38 crc kubenswrapper[5008]: E1126 23:02:38.551338 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.551355 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.551644 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.551663 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.551685 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.551706 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.551721 5008 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.551739 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.558642 5008 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.597484 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.597546 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.597580 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.597641 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.597664 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.597763 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.597813 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.597841 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.616551 5008 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.699618 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.700006 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.699804 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.700060 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.700130 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 23:02:38 crc 
kubenswrapper[5008]: I1126 23:02:38.700136 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.700150 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.700205 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.700251 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.700341 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 23:02:38 
crc kubenswrapper[5008]: I1126 23:02:38.700568 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.700655 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.700706 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.700832 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.700881 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.700920 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.729592 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.731370 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.732116 5008 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c" exitCode=0 Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.732139 5008 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f" exitCode=0 Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.732147 5008 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e" exitCode=0 Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.732154 5008 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801" exitCode=2 Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.732190 5008 scope.go:117] "RemoveContainer" containerID="b3739554bfcd085238d881be1254902e9748e16c2260dc1f247111655253045a" Nov 26 23:02:38 crc kubenswrapper[5008]: I1126 23:02:38.905906 5008 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 23:02:38 crc kubenswrapper[5008]: E1126 23:02:38.941494 5008 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.166:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187bb0e0e10e1ab7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 23:02:38.940641975 +0000 UTC m=+1434.353335987,LastTimestamp:2025-11-26 23:02:38.940641975 +0000 UTC m=+1434.353335987,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 23:02:39 crc kubenswrapper[5008]: I1126 23:02:39.750299 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 26 23:02:39 crc kubenswrapper[5008]: I1126 23:02:39.754772 5008 generic.go:334] "Generic (PLEG): container finished" podID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" containerID="958d6619b9c22bce1e3d357a4c2a7ae4d61b22dd626e15c5e32c1dfd6d26d888" exitCode=0 Nov 26 23:02:39 crc kubenswrapper[5008]: I1126 23:02:39.754885 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5","Type":"ContainerDied","Data":"958d6619b9c22bce1e3d357a4c2a7ae4d61b22dd626e15c5e32c1dfd6d26d888"} Nov 26 23:02:39 crc kubenswrapper[5008]: I1126 23:02:39.756250 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:39 crc kubenswrapper[5008]: I1126 23:02:39.757033 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:39 crc kubenswrapper[5008]: I1126 23:02:39.757286 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"91f8c3ab3f4a5f7b54e77b3e6755816591a0a58f4d462a0cb257a42534c04220"} Nov 26 23:02:39 crc kubenswrapper[5008]: I1126 23:02:39.757349 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7777560f17ef692575ab97496cd86c4055bf6b16c70656468602d67776c7c94c"} Nov 26 23:02:39 crc kubenswrapper[5008]: I1126 23:02:39.758392 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 
38.102.83.166:6443: connect: connection refused" Nov 26 23:02:39 crc kubenswrapper[5008]: I1126 23:02:39.759283 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.056143 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.057199 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.057625 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.068666 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.069849 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.070299 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.071007 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.071640 5008 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.141746 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5-var-lock\") pod \"1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5\" (UID: \"1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5\") " Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.142211 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.141875 
5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5-var-lock" (OuterVolumeSpecName: "var-lock") pod "1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" (UID: "1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.142338 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5-kubelet-dir\") pod \"1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5\" (UID: \"1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5\") " Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.142393 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.142405 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.142434 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" (UID: "1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.142483 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.142520 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5-kube-api-access\") pod \"1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5\" (UID: \"1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5\") " Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.142541 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.142617 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.143095 5008 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.143120 5008 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.143139 5008 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.143155 5008 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5-var-lock\") on node \"crc\" DevicePath \"\"" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.143172 5008 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.151621 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" (UID: "1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.245063 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.532304 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.777915 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5","Type":"ContainerDied","Data":"920e37d3564c3756f79d820501d2db4fad657a113058c1f42f8bf9d07303bbe1"} Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.778028 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="920e37d3564c3756f79d820501d2db4fad657a113058c1f42f8bf9d07303bbe1" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.778103 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.782076 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.784750 5008 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a" exitCode=0 Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.784818 5008 scope.go:117] "RemoveContainer" containerID="3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.784891 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.786782 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.789771 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.790495 5008 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.790873 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.791297 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.791757 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.792292 5008 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.792533 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.815240 5008 scope.go:117] "RemoveContainer" containerID="db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.836592 5008 scope.go:117] "RemoveContainer" containerID="e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.860709 5008 scope.go:117] "RemoveContainer" containerID="7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.892871 5008 scope.go:117] "RemoveContainer" containerID="9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.917885 5008 scope.go:117] "RemoveContainer" containerID="7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.954616 5008 scope.go:117] "RemoveContainer" containerID="3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c" Nov 26 23:02:41 crc kubenswrapper[5008]: E1126 23:02:41.955617 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\": container with ID starting with 3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c not found: ID does not exist" containerID="3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.955663 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c"} err="failed to get container status 
\"3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\": rpc error: code = NotFound desc = could not find container \"3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c\": container with ID starting with 3703a133e4201463b79fcd02d0fe178265d5d4273473019ee3495fbf03cad14c not found: ID does not exist" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.955788 5008 scope.go:117] "RemoveContainer" containerID="db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f" Nov 26 23:02:41 crc kubenswrapper[5008]: E1126 23:02:41.956245 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\": container with ID starting with db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f not found: ID does not exist" containerID="db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.956298 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f"} err="failed to get container status \"db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\": rpc error: code = NotFound desc = could not find container \"db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f\": container with ID starting with db00e90da0eb7e0addd70108208e534f60110283c75012bffbc8b92111c3958f not found: ID does not exist" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.956338 5008 scope.go:117] "RemoveContainer" containerID="e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e" Nov 26 23:02:41 crc kubenswrapper[5008]: E1126 23:02:41.956645 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\": container with ID starting with e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e not found: ID does not exist" containerID="e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.956686 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e"} err="failed to get container status \"e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\": rpc error: code = NotFound desc = could not find container \"e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e\": container with ID starting with e17ca2f789ce5cf94250bbc148e28a94ad7e0a392b9c0840f8146d348211a20e not found: ID does not exist" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.956741 5008 scope.go:117] "RemoveContainer" containerID="7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801" Nov 26 23:02:41 crc kubenswrapper[5008]: E1126 23:02:41.957695 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\": container with ID starting with 7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801 not found: ID does not exist" containerID="7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.957718 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801"} err="failed to get container status \"7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\": rpc error: code = NotFound desc = could not find container \"7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801\": container with ID 
starting with 7564adde90843a13f68f4a6b842beabd6a96643cebc018f6e30f5dfe049ef801 not found: ID does not exist" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.957732 5008 scope.go:117] "RemoveContainer" containerID="9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a" Nov 26 23:02:41 crc kubenswrapper[5008]: E1126 23:02:41.958180 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\": container with ID starting with 9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a not found: ID does not exist" containerID="9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.958229 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a"} err="failed to get container status \"9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\": rpc error: code = NotFound desc = could not find container \"9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a\": container with ID starting with 9b93d00332e9e0d262e61e67e4d14b575bb8c0d48286c2b38a4ffadf65260d0a not found: ID does not exist" Nov 26 23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.958258 5008 scope.go:117] "RemoveContainer" containerID="7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6" Nov 26 23:02:41 crc kubenswrapper[5008]: E1126 23:02:41.958676 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\": container with ID starting with 7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6 not found: ID does not exist" containerID="7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6" Nov 26 
23:02:41 crc kubenswrapper[5008]: I1126 23:02:41.958698 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6"} err="failed to get container status \"7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\": rpc error: code = NotFound desc = could not find container \"7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6\": container with ID starting with 7c7fa3e175877aaeb56efc38780ddbbf1e214b09234ea9b13aedea66ffe0c1b6 not found: ID does not exist" Nov 26 23:02:42 crc kubenswrapper[5008]: E1126 23:02:42.552449 5008 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC glance-kuttl-tests/mysql-db-openstack-galera-1: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/glance-kuttl-tests/persistentvolumeclaims/mysql-db-openstack-galera-1\": dial tcp 38.102.83.166:6443: connect: connection refused" pod="glance-kuttl-tests/openstack-galera-1" volumeName="mysql-db" Nov 26 23:02:43 crc kubenswrapper[5008]: E1126 23:02:43.031709 5008 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:43 crc kubenswrapper[5008]: E1126 23:02:43.032591 5008 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:43 crc kubenswrapper[5008]: E1126 23:02:43.033319 5008 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:43 crc 
kubenswrapper[5008]: E1126 23:02:43.033840 5008 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:43 crc kubenswrapper[5008]: E1126 23:02:43.034574 5008 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:43 crc kubenswrapper[5008]: I1126 23:02:43.034644 5008 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Nov 26 23:02:43 crc kubenswrapper[5008]: E1126 23:02:43.035185 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="200ms" Nov 26 23:02:43 crc kubenswrapper[5008]: E1126 23:02:43.235743 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="400ms" Nov 26 23:02:43 crc kubenswrapper[5008]: E1126 23:02:43.637629 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="800ms" Nov 26 23:02:44 crc kubenswrapper[5008]: E1126 23:02:44.438822 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="1.6s" Nov 26 23:02:44 crc kubenswrapper[5008]: E1126 23:02:44.646942 5008 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.166:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187bb0e0e10e1ab7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 23:02:38.940641975 +0000 UTC m=+1434.353335987,LastTimestamp:2025-11-26 23:02:38.940641975 +0000 UTC m=+1434.353335987,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 23:02:45 crc kubenswrapper[5008]: I1126 23:02:45.540617 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:45 crc kubenswrapper[5008]: I1126 23:02:45.543318 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:45 crc kubenswrapper[5008]: E1126 23:02:45.605194 5008 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC glance-kuttl-tests/mysql-db-openstack-galera-2: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/glance-kuttl-tests/persistentvolumeclaims/mysql-db-openstack-galera-2\": dial tcp 38.102.83.166:6443: connect: connection refused" pod="glance-kuttl-tests/openstack-galera-2" volumeName="mysql-db" Nov 26 23:02:46 crc kubenswrapper[5008]: E1126 23:02:46.039908 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="3.2s" Nov 26 23:02:49 crc kubenswrapper[5008]: E1126 23:02:49.240720 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="6.4s" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.685677 5008 scope.go:117] "RemoveContainer" containerID="36f217132e067758614f2904e77682580bca0cacc4d32c857da8ec7c2f264f16" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.844275 5008 scope.go:117] "RemoveContainer" containerID="438c620210d1abe035f72ec94d1312307d1298390e190607a64c3946021e9f0d" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.889833 5008 generic.go:334] "Generic (PLEG): container finished" podID="0280585a-1314-4eac-9fc6-d83aa687a4f4" containerID="8259b0b4910f9a2fe20ae8be5de9bfc9596aa13c2cf2425ec7d1c1c2daa0662c" exitCode=1 Nov 26 
23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.889899 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" event={"ID":"0280585a-1314-4eac-9fc6-d83aa687a4f4","Type":"ContainerDied","Data":"8259b0b4910f9a2fe20ae8be5de9bfc9596aa13c2cf2425ec7d1c1c2daa0662c"} Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.891054 5008 status_manager.go:851] "Failed to get status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.891082 5008 scope.go:117] "RemoveContainer" containerID="8259b0b4910f9a2fe20ae8be5de9bfc9596aa13c2cf2425ec7d1c1c2daa0662c" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.891647 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.892204 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.893236 5008 generic.go:334] "Generic (PLEG): container finished" podID="d0c58bd4-2da5-4770-b562-aad453776b10" 
containerID="bc05ad5a81c3457be482457c1712709f741ccb8e7683be16e0410e59c465646c" exitCode=1 Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.893343 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" event={"ID":"d0c58bd4-2da5-4770-b562-aad453776b10","Type":"ContainerDied","Data":"bc05ad5a81c3457be482457c1712709f741ccb8e7683be16e0410e59c465646c"} Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.894186 5008 scope.go:117] "RemoveContainer" containerID="bc05ad5a81c3457be482457c1712709f741ccb8e7683be16e0410e59c465646c" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.894253 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.894499 5008 status_manager.go:851] "Failed to get status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.894888 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.895362 5008 status_manager.go:851] "Failed to get status for pod" 
podUID="d0c58bd4-2da5-4770-b562-aad453776b10" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-hg8m4\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.896408 5008 generic.go:334] "Generic (PLEG): container finished" podID="72dcab39-084a-40c0-8646-acf173ea065d" containerID="703975cab045914a73e041489c2576731e07258d37640438c70791f25ae67c2f" exitCode=1 Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.896518 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" event={"ID":"72dcab39-084a-40c0-8646-acf173ea065d","Type":"ContainerDied","Data":"703975cab045914a73e041489c2576731e07258d37640438c70791f25ae67c2f"} Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.897481 5008 scope.go:117] "RemoveContainer" containerID="703975cab045914a73e041489c2576731e07258d37640438c70791f25ae67c2f" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.897587 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.898009 5008 status_manager.go:851] "Failed to get status for pod" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-hg8m4\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.898894 
5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.899298 5008 status_manager.go:851] "Failed to get status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.899521 5008 status_manager.go:851] "Failed to get status for pod" podUID="72dcab39-084a-40c0-8646-acf173ea065d" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-7f6c555587-tvtn9\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.901523 5008 generic.go:334] "Generic (PLEG): container finished" podID="aae33cf9-f71c-4878-86c4-218de3173f3a" containerID="87768de0774080581a5c60ce4320bcfb5daa74d521e7e6e17601ed50b0ab799f" exitCode=1 Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.901649 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" event={"ID":"aae33cf9-f71c-4878-86c4-218de3173f3a","Type":"ContainerDied","Data":"87768de0774080581a5c60ce4320bcfb5daa74d521e7e6e17601ed50b0ab799f"} Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.902490 5008 scope.go:117] "RemoveContainer" containerID="87768de0774080581a5c60ce4320bcfb5daa74d521e7e6e17601ed50b0ab799f" Nov 
26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.902622 5008 status_manager.go:851] "Failed to get status for pod" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-hg8m4\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.903107 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.903520 5008 status_manager.go:851] "Failed to get status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.904059 5008 status_manager.go:851] "Failed to get status for pod" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-86cc6c797c-xj5wk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.904710 5008 status_manager.go:851] "Failed to get status for pod" podUID="72dcab39-084a-40c0-8646-acf173ea065d" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-7f6c555587-tvtn9\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.905388 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.906085 5008 generic.go:334] "Generic (PLEG): container finished" podID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" containerID="52bbce5f4fecf343d959649894234c07502df7f01fc7c536f0004f31828b5d98" exitCode=1 Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.906204 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" event={"ID":"9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5","Type":"ContainerDied","Data":"52bbce5f4fecf343d959649894234c07502df7f01fc7c536f0004f31828b5d98"} Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.906908 5008 scope.go:117] "RemoveContainer" containerID="52bbce5f4fecf343d959649894234c07502df7f01fc7c536f0004f31828b5d98" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.907222 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.907616 5008 status_manager.go:851] "Failed to get status for pod" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" 
pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-7d66f7697f-2vlzj\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.908094 5008 status_manager.go:851] "Failed to get status for pod" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-hg8m4\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.908754 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.909292 5008 status_manager.go:851] "Failed to get status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc kubenswrapper[5008]: I1126 23:02:49.910052 5008 status_manager.go:851] "Failed to get status for pod" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-86cc6c797c-xj5wk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:49 crc 
kubenswrapper[5008]: I1126 23:02:49.910582 5008 status_manager.go:851] "Failed to get status for pod" podUID="72dcab39-084a-40c0-8646-acf173ea065d" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-7f6c555587-tvtn9\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.933826 5008 generic.go:334] "Generic (PLEG): container finished" podID="9309fbab-00dc-4e76-a384-b9297f098fe9" containerID="35576893d513cf3dd2135514bfd4cd05a3e2bd8157ec5e3978fe51affc6d23cf" exitCode=1 Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.933989 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" event={"ID":"9309fbab-00dc-4e76-a384-b9297f098fe9","Type":"ContainerDied","Data":"35576893d513cf3dd2135514bfd4cd05a3e2bd8157ec5e3978fe51affc6d23cf"} Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.935138 5008 scope.go:117] "RemoveContainer" containerID="35576893d513cf3dd2135514bfd4cd05a3e2bd8157ec5e3978fe51affc6d23cf" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.935476 5008 status_manager.go:851] "Failed to get status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.935876 5008 status_manager.go:851] "Failed to get status for pod" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-86cc6c797c-xj5wk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.936224 5008 status_manager.go:851] "Failed to get status for pod" podUID="72dcab39-084a-40c0-8646-acf173ea065d" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-7f6c555587-tvtn9\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.936672 5008 status_manager.go:851] "Failed to get status for pod" podUID="9309fbab-00dc-4e76-a384-b9297f098fe9" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/infra-operator-controller-manager-5476c5fbf7-zg2vt\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.937323 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.937688 5008 status_manager.go:851] "Failed to get status for pod" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-7d66f7697f-2vlzj\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 
23:02:50.938293 5008 status_manager.go:851] "Failed to get status for pod" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-hg8m4\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.938660 5008 generic.go:334] "Generic (PLEG): container finished" podID="72dcab39-084a-40c0-8646-acf173ea065d" containerID="303cf3d2ab99c50b9f24201859ac253edb4b750718e3e717aecdf73ae5835e30" exitCode=1 Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.938738 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" event={"ID":"72dcab39-084a-40c0-8646-acf173ea065d","Type":"ContainerDied","Data":"303cf3d2ab99c50b9f24201859ac253edb4b750718e3e717aecdf73ae5835e30"} Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.938756 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.938796 5008 scope.go:117] "RemoveContainer" containerID="703975cab045914a73e041489c2576731e07258d37640438c70791f25ae67c2f" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.939310 5008 scope.go:117] "RemoveContainer" containerID="303cf3d2ab99c50b9f24201859ac253edb4b750718e3e717aecdf73ae5835e30" Nov 26 23:02:50 crc kubenswrapper[5008]: E1126 23:02:50.939626 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager 
pod=keystone-operator-controller-manager-7f6c555587-tvtn9_openstack-operators(72dcab39-084a-40c0-8646-acf173ea065d)\"" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" podUID="72dcab39-084a-40c0-8646-acf173ea065d" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.939782 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.940278 5008 status_manager.go:851] "Failed to get status for pod" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-7d66f7697f-2vlzj\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.940787 5008 status_manager.go:851] "Failed to get status for pod" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-hg8m4\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.941364 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.941760 5008 
status_manager.go:851] "Failed to get status for pod" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-86cc6c797c-xj5wk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.942164 5008 status_manager.go:851] "Failed to get status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.942532 5008 status_manager.go:851] "Failed to get status for pod" podUID="72dcab39-084a-40c0-8646-acf173ea065d" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-7f6c555587-tvtn9\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.942896 5008 status_manager.go:851] "Failed to get status for pod" podUID="9309fbab-00dc-4e76-a384-b9297f098fe9" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/infra-operator-controller-manager-5476c5fbf7-zg2vt\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.951456 5008 generic.go:334] "Generic (PLEG): container finished" podID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" containerID="1a2e495cca0fb5bc4a901d584656bf291b5af480354496e65e92345f2d68c1e7" exitCode=1 Nov 26 23:02:50 crc 
kubenswrapper[5008]: I1126 23:02:50.951558 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" event={"ID":"9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5","Type":"ContainerDied","Data":"1a2e495cca0fb5bc4a901d584656bf291b5af480354496e65e92345f2d68c1e7"} Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.952343 5008 scope.go:117] "RemoveContainer" containerID="1a2e495cca0fb5bc4a901d584656bf291b5af480354496e65e92345f2d68c1e7" Nov 26 23:02:50 crc kubenswrapper[5008]: E1126 23:02:50.952695 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=metallb-operator-controller-manager-7d66f7697f-2vlzj_metallb-system(9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5)\"" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.953298 5008 status_manager.go:851] "Failed to get status for pod" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-7d66f7697f-2vlzj\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.953626 5008 status_manager.go:851] "Failed to get status for pod" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-hg8m4\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.954138 5008 status_manager.go:851] "Failed to get status for pod" 
podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.954581 5008 status_manager.go:851] "Failed to get status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.955181 5008 status_manager.go:851] "Failed to get status for pod" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-86cc6c797c-xj5wk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.955620 5008 status_manager.go:851] "Failed to get status for pod" podUID="72dcab39-084a-40c0-8646-acf173ea065d" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-7f6c555587-tvtn9\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.956053 5008 status_manager.go:851] "Failed to get status for pod" podUID="9309fbab-00dc-4e76-a384-b9297f098fe9" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/infra-operator-controller-manager-5476c5fbf7-zg2vt\": dial tcp 
38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.957066 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.959384 5008 generic.go:334] "Generic (PLEG): container finished" podID="95e78ec8-1a94-47ca-b471-10ba505c5583" containerID="0a9db12963cb3ca17ff0916f648fc20398d31f12881d38d434dcd29997c52725" exitCode=1 Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.959489 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" event={"ID":"95e78ec8-1a94-47ca-b471-10ba505c5583","Type":"ContainerDied","Data":"0a9db12963cb3ca17ff0916f648fc20398d31f12881d38d434dcd29997c52725"} Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.960345 5008 scope.go:117] "RemoveContainer" containerID="0a9db12963cb3ca17ff0916f648fc20398d31f12881d38d434dcd29997c52725" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.960869 5008 status_manager.go:851] "Failed to get status for pod" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-7d66f7697f-2vlzj\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.961410 5008 status_manager.go:851] "Failed to get status for pod" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-hg8m4\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.962056 5008 status_manager.go:851] "Failed to get status for pod" podUID="95e78ec8-1a94-47ca-b471-10ba505c5583" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/glance-operator-controller-manager-76d465bf76-r74xk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.962559 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.963211 5008 status_manager.go:851] "Failed to get status for pod" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-86cc6c797c-xj5wk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.963436 5008 generic.go:334] "Generic (PLEG): container finished" podID="0280585a-1314-4eac-9fc6-d83aa687a4f4" containerID="d7113e37844cbe56c2bf691d9b421f0506f62fca35d414a3a654d6e48c6fb1db" exitCode=1 Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.963528 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" 
event={"ID":"0280585a-1314-4eac-9fc6-d83aa687a4f4","Type":"ContainerDied","Data":"d7113e37844cbe56c2bf691d9b421f0506f62fca35d414a3a654d6e48c6fb1db"} Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.963771 5008 status_manager.go:851] "Failed to get status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.964173 5008 scope.go:117] "RemoveContainer" containerID="d7113e37844cbe56c2bf691d9b421f0506f62fca35d414a3a654d6e48c6fb1db" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.964316 5008 status_manager.go:851] "Failed to get status for pod" podUID="72dcab39-084a-40c0-8646-acf173ea065d" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-7f6c555587-tvtn9\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: E1126 23:02:50.965099 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=horizon-operator-controller-manager-5765b658-hkfvl_openstack-operators(0280585a-1314-4eac-9fc6-d83aa687a4f4)\"" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.965097 5008 status_manager.go:851] "Failed to get status for pod" podUID="9309fbab-00dc-4e76-a384-b9297f098fe9" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/infra-operator-controller-manager-5476c5fbf7-zg2vt\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.966149 5008 generic.go:334] "Generic (PLEG): container finished" podID="a593559a-2caa-41b9-86bd-5f290b91f6ae" containerID="a28909963a1540ccbf3b5c829e0936657a00b7d479af521835f778af184fc017" exitCode=1 Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.966161 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.966234 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" event={"ID":"a593559a-2caa-41b9-86bd-5f290b91f6ae","Type":"ContainerDied","Data":"a28909963a1540ccbf3b5c829e0936657a00b7d479af521835f778af184fc017"} Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.966697 5008 status_manager.go:851] "Failed to get status for pod" podUID="72dcab39-084a-40c0-8646-acf173ea065d" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-7f6c555587-tvtn9\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.966769 5008 scope.go:117] "RemoveContainer" containerID="a28909963a1540ccbf3b5c829e0936657a00b7d479af521835f778af184fc017" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.967097 5008 status_manager.go:851] "Failed to get status for pod" 
podUID="9309fbab-00dc-4e76-a384-b9297f098fe9" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/infra-operator-controller-manager-5476c5fbf7-zg2vt\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.967480 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.968164 5008 status_manager.go:851] "Failed to get status for pod" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-7d66f7697f-2vlzj\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.968730 5008 status_manager.go:851] "Failed to get status for pod" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-hg8m4\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.969153 5008 status_manager.go:851] "Failed to get status for pod" podUID="95e78ec8-1a94-47ca-b471-10ba505c5583" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/glance-operator-controller-manager-76d465bf76-r74xk\": dial 
tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.969609 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.970325 5008 status_manager.go:851] "Failed to get status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.970849 5008 status_manager.go:851] "Failed to get status for pod" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-86cc6c797c-xj5wk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.971839 5008 status_manager.go:851] "Failed to get status for pod" podUID="72dcab39-084a-40c0-8646-acf173ea065d" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-7f6c555587-tvtn9\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.972080 5008 generic.go:334] "Generic (PLEG): container finished" podID="d0c58bd4-2da5-4770-b562-aad453776b10" 
containerID="acaf907aeef1d298522c53c55ac006672d3d7b3475784778e7a8ac3ecf54ad2c" exitCode=1 Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.972166 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" event={"ID":"d0c58bd4-2da5-4770-b562-aad453776b10","Type":"ContainerDied","Data":"acaf907aeef1d298522c53c55ac006672d3d7b3475784778e7a8ac3ecf54ad2c"} Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.972616 5008 scope.go:117] "RemoveContainer" containerID="acaf907aeef1d298522c53c55ac006672d3d7b3475784778e7a8ac3ecf54ad2c" Nov 26 23:02:50 crc kubenswrapper[5008]: E1126 23:02:50.972921 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=operator pod=rabbitmq-cluster-operator-779fc9694b-hg8m4_openstack-operators(d0c58bd4-2da5-4770-b562-aad453776b10)\"" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.974085 5008 status_manager.go:851] "Failed to get status for pod" podUID="9309fbab-00dc-4e76-a384-b9297f098fe9" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/infra-operator-controller-manager-5476c5fbf7-zg2vt\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.974707 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.975571 
5008 status_manager.go:851] "Failed to get status for pod" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-7d66f7697f-2vlzj\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.976236 5008 status_manager.go:851] "Failed to get status for pod" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-hg8m4\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.976803 5008 status_manager.go:851] "Failed to get status for pod" podUID="95e78ec8-1a94-47ca-b471-10ba505c5583" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/glance-operator-controller-manager-76d465bf76-r74xk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.977484 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.978278 5008 generic.go:334] "Generic (PLEG): container finished" podID="aae33cf9-f71c-4878-86c4-218de3173f3a" containerID="4926f3fa2a4ba11f026f9ed936cb85b32d8be5faff2cd99cd659a4d759cfcb84" exitCode=1 Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.978244 5008 status_manager.go:851] "Failed to get 
status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.978313 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" event={"ID":"aae33cf9-f71c-4878-86c4-218de3173f3a","Type":"ContainerDied","Data":"4926f3fa2a4ba11f026f9ed936cb85b32d8be5faff2cd99cd659a4d759cfcb84"} Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.979821 5008 status_manager.go:851] "Failed to get status for pod" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-86cc6c797c-xj5wk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.980437 5008 status_manager.go:851] "Failed to get status for pod" podUID="a593559a-2caa-41b9-86bd-5f290b91f6ae" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-75b97bfb54-shcmm\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.980529 5008 scope.go:117] "RemoveContainer" containerID="4926f3fa2a4ba11f026f9ed936cb85b32d8be5faff2cd99cd659a4d759cfcb84" Nov 26 23:02:50 crc kubenswrapper[5008]: E1126 23:02:50.981391 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager 
pod=swift-operator-controller-manager-86cc6c797c-xj5wk_openstack-operators(aae33cf9-f71c-4878-86c4-218de3173f3a)\"" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.981526 5008 status_manager.go:851] "Failed to get status for pod" podUID="72dcab39-084a-40c0-8646-acf173ea065d" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-7f6c555587-tvtn9\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.982022 5008 status_manager.go:851] "Failed to get status for pod" podUID="9309fbab-00dc-4e76-a384-b9297f098fe9" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/infra-operator-controller-manager-5476c5fbf7-zg2vt\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.982559 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.983103 5008 status_manager.go:851] "Failed to get status for pod" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-7d66f7697f-2vlzj\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 
23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.983439 5008 status_manager.go:851] "Failed to get status for pod" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-hg8m4\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.983882 5008 status_manager.go:851] "Failed to get status for pod" podUID="95e78ec8-1a94-47ca-b471-10ba505c5583" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/glance-operator-controller-manager-76d465bf76-r74xk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.984677 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.985083 5008 status_manager.go:851] "Failed to get status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.985462 5008 status_manager.go:851] "Failed to get status for pod" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-86cc6c797c-xj5wk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:50 crc kubenswrapper[5008]: I1126 23:02:50.985863 5008 status_manager.go:851] "Failed to get status for pod" podUID="a593559a-2caa-41b9-86bd-5f290b91f6ae" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-75b97bfb54-shcmm\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:51 crc kubenswrapper[5008]: I1126 23:02:51.032568 5008 scope.go:117] "RemoveContainer" containerID="52bbce5f4fecf343d959649894234c07502df7f01fc7c536f0004f31828b5d98" Nov 26 23:02:51 crc kubenswrapper[5008]: I1126 23:02:51.033004 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" Nov 26 23:02:51 crc kubenswrapper[5008]: I1126 23:02:51.123392 5008 scope.go:117] "RemoveContainer" containerID="8259b0b4910f9a2fe20ae8be5de9bfc9596aa13c2cf2425ec7d1c1c2daa0662c" Nov 26 23:02:51 crc kubenswrapper[5008]: I1126 23:02:51.165136 5008 scope.go:117] "RemoveContainer" containerID="bc05ad5a81c3457be482457c1712709f741ccb8e7683be16e0410e59c465646c" Nov 26 23:02:51 crc kubenswrapper[5008]: I1126 23:02:51.211549 5008 scope.go:117] "RemoveContainer" containerID="87768de0774080581a5c60ce4320bcfb5daa74d521e7e6e17601ed50b0ab799f" Nov 26 23:02:51 crc kubenswrapper[5008]: I1126 23:02:51.518034 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 23:02:51 crc kubenswrapper[5008]: I1126 23:02:51.519408 5008 status_manager.go:851] "Failed to get status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:51 crc kubenswrapper[5008]: I1126 23:02:51.520007 5008 status_manager.go:851] "Failed to get status for pod" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-86cc6c797c-xj5wk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:51 crc kubenswrapper[5008]: I1126 23:02:51.520779 5008 status_manager.go:851] "Failed to get status for pod" podUID="a593559a-2caa-41b9-86bd-5f290b91f6ae" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-75b97bfb54-shcmm\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:51 crc kubenswrapper[5008]: I1126 23:02:51.521450 5008 status_manager.go:851] "Failed to get status for pod" podUID="72dcab39-084a-40c0-8646-acf173ea065d" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-7f6c555587-tvtn9\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:51 crc kubenswrapper[5008]: I1126 23:02:51.521912 5008 status_manager.go:851] "Failed to get status for pod" 
podUID="9309fbab-00dc-4e76-a384-b9297f098fe9" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/infra-operator-controller-manager-5476c5fbf7-zg2vt\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:51 crc kubenswrapper[5008]: I1126 23:02:51.522498 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:51 crc kubenswrapper[5008]: I1126 23:02:51.523056 5008 status_manager.go:851] "Failed to get status for pod" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-7d66f7697f-2vlzj\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:51 crc kubenswrapper[5008]: I1126 23:02:51.523519 5008 status_manager.go:851] "Failed to get status for pod" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-hg8m4\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:51 crc kubenswrapper[5008]: I1126 23:02:51.524012 5008 status_manager.go:851] "Failed to get status for pod" podUID="95e78ec8-1a94-47ca-b471-10ba505c5583" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/glance-operator-controller-manager-76d465bf76-r74xk\": dial 
tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:51 crc kubenswrapper[5008]: I1126 23:02:51.524539 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:51 crc kubenswrapper[5008]: I1126 23:02:51.545885 5008 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca6251c6-3571-4928-a464-ab761a51d240" Nov 26 23:02:51 crc kubenswrapper[5008]: I1126 23:02:51.545928 5008 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca6251c6-3571-4928-a464-ab761a51d240" Nov 26 23:02:51 crc kubenswrapper[5008]: E1126 23:02:51.546563 5008 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 23:02:51 crc kubenswrapper[5008]: I1126 23:02:51.547450 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 23:02:51 crc kubenswrapper[5008]: W1126 23:02:51.584665 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-f8dd0aeae230d3bc57e825e49da5d45827e2a792366123120e4032c61e363ef2 WatchSource:0}: Error finding container f8dd0aeae230d3bc57e825e49da5d45827e2a792366123120e4032c61e363ef2: Status 404 returned error can't find the container with id f8dd0aeae230d3bc57e825e49da5d45827e2a792366123120e4032c61e363ef2 Nov 26 23:02:51 crc kubenswrapper[5008]: I1126 23:02:51.623701 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.005376 5008 generic.go:334] "Generic (PLEG): container finished" podID="9309fbab-00dc-4e76-a384-b9297f098fe9" containerID="0abbfc6bb258bfa371e1350a69f1897c26c72c251359e79a8b88408682157837" exitCode=1 Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.007105 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" event={"ID":"9309fbab-00dc-4e76-a384-b9297f098fe9","Type":"ContainerDied","Data":"0abbfc6bb258bfa371e1350a69f1897c26c72c251359e79a8b88408682157837"} Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.007350 5008 scope.go:117] "RemoveContainer" containerID="35576893d513cf3dd2135514bfd4cd05a3e2bd8157ec5e3978fe51affc6d23cf" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.009344 5008 scope.go:117] "RemoveContainer" containerID="0abbfc6bb258bfa371e1350a69f1897c26c72c251359e79a8b88408682157837" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.011004 5008 status_manager.go:851] "Failed to get status for pod" podUID="72dcab39-084a-40c0-8646-acf173ea065d" 
pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-7f6c555587-tvtn9\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: E1126 23:02:52.010830 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=infra-operator-controller-manager-5476c5fbf7-zg2vt_openstack-operators(9309fbab-00dc-4e76-a384-b9297f098fe9)\"" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" podUID="9309fbab-00dc-4e76-a384-b9297f098fe9" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.011860 5008 status_manager.go:851] "Failed to get status for pod" podUID="9309fbab-00dc-4e76-a384-b9297f098fe9" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/infra-operator-controller-manager-5476c5fbf7-zg2vt\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.012882 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.013418 5008 status_manager.go:851] "Failed to get status for pod" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-7d66f7697f-2vlzj\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.014135 5008 status_manager.go:851] "Failed to get status for pod" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-hg8m4\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.014597 5008 status_manager.go:851] "Failed to get status for pod" podUID="95e78ec8-1a94-47ca-b471-10ba505c5583" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/glance-operator-controller-manager-76d465bf76-r74xk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.015136 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.015687 5008 generic.go:334] "Generic (PLEG): container finished" podID="95e78ec8-1a94-47ca-b471-10ba505c5583" containerID="94fbd05470d91b12c8f82de8b5863429ee9e4c3406ad075b5db841e8a127db03" exitCode=1 Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.015867 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" 
event={"ID":"95e78ec8-1a94-47ca-b471-10ba505c5583","Type":"ContainerDied","Data":"94fbd05470d91b12c8f82de8b5863429ee9e4c3406ad075b5db841e8a127db03"} Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.015893 5008 status_manager.go:851] "Failed to get status for pod" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-86cc6c797c-xj5wk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.016492 5008 status_manager.go:851] "Failed to get status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.016830 5008 status_manager.go:851] "Failed to get status for pod" podUID="a593559a-2caa-41b9-86bd-5f290b91f6ae" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-75b97bfb54-shcmm\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.016908 5008 scope.go:117] "RemoveContainer" containerID="94fbd05470d91b12c8f82de8b5863429ee9e4c3406ad075b5db841e8a127db03" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.017394 5008 status_manager.go:851] "Failed to get status for pod" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-7d66f7697f-2vlzj\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: E1126 23:02:52.017481 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=glance-operator-controller-manager-76d465bf76-r74xk_openstack-operators(95e78ec8-1a94-47ca-b471-10ba505c5583)\"" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" podUID="95e78ec8-1a94-47ca-b471-10ba505c5583" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.017716 5008 status_manager.go:851] "Failed to get status for pod" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-hg8m4\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.018053 5008 status_manager.go:851] "Failed to get status for pod" podUID="95e78ec8-1a94-47ca-b471-10ba505c5583" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/glance-operator-controller-manager-76d465bf76-r74xk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.018382 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.019136 5008 
status_manager.go:851] "Failed to get status for pod" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-86cc6c797c-xj5wk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.019823 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f8dd0aeae230d3bc57e825e49da5d45827e2a792366123120e4032c61e363ef2"} Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.020220 5008 status_manager.go:851] "Failed to get status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.020734 5008 status_manager.go:851] "Failed to get status for pod" podUID="a593559a-2caa-41b9-86bd-5f290b91f6ae" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-75b97bfb54-shcmm\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.021654 5008 status_manager.go:851] "Failed to get status for pod" podUID="72dcab39-084a-40c0-8646-acf173ea065d" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-7f6c555587-tvtn9\": dial tcp 
38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.022407 5008 status_manager.go:851] "Failed to get status for pod" podUID="9309fbab-00dc-4e76-a384-b9297f098fe9" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/infra-operator-controller-manager-5476c5fbf7-zg2vt\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.022796 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.027087 5008 generic.go:334] "Generic (PLEG): container finished" podID="a593559a-2caa-41b9-86bd-5f290b91f6ae" containerID="d24e00d6b89b869314590fb24edbf9f9349cfc7f5a49f74e3f74db57beaba01b" exitCode=1 Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.027198 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" event={"ID":"a593559a-2caa-41b9-86bd-5f290b91f6ae","Type":"ContainerDied","Data":"d24e00d6b89b869314590fb24edbf9f9349cfc7f5a49f74e3f74db57beaba01b"} Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.027869 5008 scope.go:117] "RemoveContainer" containerID="d24e00d6b89b869314590fb24edbf9f9349cfc7f5a49f74e3f74db57beaba01b" Nov 26 23:02:52 crc kubenswrapper[5008]: E1126 23:02:52.028287 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager 
pod=mariadb-operator-controller-manager-75b97bfb54-shcmm_openstack-operators(a593559a-2caa-41b9-86bd-5f290b91f6ae)\"" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" podUID="a593559a-2caa-41b9-86bd-5f290b91f6ae" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.028562 5008 status_manager.go:851] "Failed to get status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.029020 5008 status_manager.go:851] "Failed to get status for pod" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-86cc6c797c-xj5wk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.029451 5008 status_manager.go:851] "Failed to get status for pod" podUID="a593559a-2caa-41b9-86bd-5f290b91f6ae" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-75b97bfb54-shcmm\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.030864 5008 status_manager.go:851] "Failed to get status for pod" podUID="72dcab39-084a-40c0-8646-acf173ea065d" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-7f6c555587-tvtn9\": dial tcp 38.102.83.166:6443: 
connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.032038 5008 status_manager.go:851] "Failed to get status for pod" podUID="9309fbab-00dc-4e76-a384-b9297f098fe9" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/infra-operator-controller-manager-5476c5fbf7-zg2vt\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.032799 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.033383 5008 status_manager.go:851] "Failed to get status for pod" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-7d66f7697f-2vlzj\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.037233 5008 status_manager.go:851] "Failed to get status for pod" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-hg8m4\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.038266 5008 status_manager.go:851] "Failed to get status for pod" podUID="95e78ec8-1a94-47ca-b471-10ba505c5583" 
pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/glance-operator-controller-manager-76d465bf76-r74xk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.038916 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.041040 5008 scope.go:117] "RemoveContainer" containerID="4926f3fa2a4ba11f026f9ed936cb85b32d8be5faff2cd99cd659a4d759cfcb84" Nov 26 23:02:52 crc kubenswrapper[5008]: E1126 23:02:52.041464 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=swift-operator-controller-manager-86cc6c797c-xj5wk_openstack-operators(aae33cf9-f71c-4878-86c4-218de3173f3a)\"" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.041464 5008 status_manager.go:851] "Failed to get status for pod" podUID="95e78ec8-1a94-47ca-b471-10ba505c5583" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/glance-operator-controller-manager-76d465bf76-r74xk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.043189 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.043816 5008 status_manager.go:851] "Failed to get status for pod" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-86cc6c797c-xj5wk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.044471 5008 status_manager.go:851] "Failed to get status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.045163 5008 status_manager.go:851] "Failed to get status for pod" podUID="a593559a-2caa-41b9-86bd-5f290b91f6ae" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-75b97bfb54-shcmm\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.045747 5008 status_manager.go:851] "Failed to get status for pod" podUID="72dcab39-084a-40c0-8646-acf173ea065d" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-7f6c555587-tvtn9\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 
23:02:52.046286 5008 status_manager.go:851] "Failed to get status for pod" podUID="9309fbab-00dc-4e76-a384-b9297f098fe9" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/infra-operator-controller-manager-5476c5fbf7-zg2vt\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.046868 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.047478 5008 status_manager.go:851] "Failed to get status for pod" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-7d66f7697f-2vlzj\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.048049 5008 status_manager.go:851] "Failed to get status for pod" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-hg8m4\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.146676 5008 scope.go:117] "RemoveContainer" containerID="0a9db12963cb3ca17ff0916f648fc20398d31f12881d38d434dcd29997c52725" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.212790 5008 scope.go:117] "RemoveContainer" 
containerID="a28909963a1540ccbf3b5c829e0936657a00b7d479af521835f778af184fc017" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.659641 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.660786 5008 scope.go:117] "RemoveContainer" containerID="1a2e495cca0fb5bc4a901d584656bf291b5af480354496e65e92345f2d68c1e7" Nov 26 23:02:52 crc kubenswrapper[5008]: E1126 23:02:52.661255 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=metallb-operator-controller-manager-7d66f7697f-2vlzj_metallb-system(9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5)\"" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.661545 5008 status_manager.go:851] "Failed to get status for pod" podUID="72dcab39-084a-40c0-8646-acf173ea065d" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-7f6c555587-tvtn9\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.662678 5008 status_manager.go:851] "Failed to get status for pod" podUID="9309fbab-00dc-4e76-a384-b9297f098fe9" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/infra-operator-controller-manager-5476c5fbf7-zg2vt\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.663430 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.664015 5008 status_manager.go:851] "Failed to get status for pod" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-7d66f7697f-2vlzj\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.664531 5008 status_manager.go:851] "Failed to get status for pod" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-hg8m4\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.664909 5008 status_manager.go:851] "Failed to get status for pod" podUID="95e78ec8-1a94-47ca-b471-10ba505c5583" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/glance-operator-controller-manager-76d465bf76-r74xk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.665280 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 
23:02:52.665624 5008 status_manager.go:851] "Failed to get status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.665934 5008 status_manager.go:851] "Failed to get status for pod" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-86cc6c797c-xj5wk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:52 crc kubenswrapper[5008]: I1126 23:02:52.666311 5008 status_manager.go:851] "Failed to get status for pod" podUID="a593559a-2caa-41b9-86bd-5f290b91f6ae" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-75b97bfb54-shcmm\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.054503 5008 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="5aa58b6bc33eacc823276f6eed211f5f05b758c817b1e0f445b23f35a0c5728f" exitCode=0 Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.054617 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"5aa58b6bc33eacc823276f6eed211f5f05b758c817b1e0f445b23f35a0c5728f"} Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.055250 5008 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca6251c6-3571-4928-a464-ab761a51d240" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.055365 5008 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca6251c6-3571-4928-a464-ab761a51d240" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.056059 5008 status_manager.go:851] "Failed to get status for pod" podUID="9309fbab-00dc-4e76-a384-b9297f098fe9" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/infra-operator-controller-manager-5476c5fbf7-zg2vt\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: E1126 23:02:53.056284 5008 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.056505 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.056884 5008 status_manager.go:851] "Failed to get status for pod" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-7d66f7697f-2vlzj\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 
23:02:53.057337 5008 status_manager.go:851] "Failed to get status for pod" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-hg8m4\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.057825 5008 status_manager.go:851] "Failed to get status for pod" podUID="95e78ec8-1a94-47ca-b471-10ba505c5583" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/glance-operator-controller-manager-76d465bf76-r74xk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.058502 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.059060 5008 status_manager.go:851] "Failed to get status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.059536 5008 status_manager.go:851] "Failed to get status for pod" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-86cc6c797c-xj5wk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.060166 5008 status_manager.go:851] "Failed to get status for pod" podUID="a593559a-2caa-41b9-86bd-5f290b91f6ae" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-75b97bfb54-shcmm\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.060391 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.060462 5008 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61" exitCode=1 Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.060578 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61"} Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.060866 5008 status_manager.go:851] "Failed to get status for pod" podUID="72dcab39-084a-40c0-8646-acf173ea065d" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-7f6c555587-tvtn9\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.061497 5008 
scope.go:117] "RemoveContainer" containerID="b6c55895e235986fd4ff51bbda0d8cf8162fec74bc8842b2bb53842d8a1f2b61" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.061600 5008 status_manager.go:851] "Failed to get status for pod" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-hg8m4\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.062152 5008 status_manager.go:851] "Failed to get status for pod" podUID="95e78ec8-1a94-47ca-b471-10ba505c5583" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/glance-operator-controller-manager-76d465bf76-r74xk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.062617 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.063234 5008 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.063765 5008 status_manager.go:851] "Failed to get status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" 
pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.064496 5008 status_manager.go:851] "Failed to get status for pod" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-86cc6c797c-xj5wk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.064898 5008 status_manager.go:851] "Failed to get status for pod" podUID="a593559a-2caa-41b9-86bd-5f290b91f6ae" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-75b97bfb54-shcmm\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.066931 5008 status_manager.go:851] "Failed to get status for pod" podUID="72dcab39-084a-40c0-8646-acf173ea065d" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-7f6c555587-tvtn9\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.067587 5008 status_manager.go:851] "Failed to get status for pod" podUID="9309fbab-00dc-4e76-a384-b9297f098fe9" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/infra-operator-controller-manager-5476c5fbf7-zg2vt\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.070908 5008 scope.go:117] "RemoveContainer" containerID="d24e00d6b89b869314590fb24edbf9f9349cfc7f5a49f74e3f74db57beaba01b" Nov 26 23:02:53 crc kubenswrapper[5008]: E1126 23:02:53.071360 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=mariadb-operator-controller-manager-75b97bfb54-shcmm_openstack-operators(a593559a-2caa-41b9-86bd-5f290b91f6ae)\"" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" podUID="a593559a-2caa-41b9-86bd-5f290b91f6ae" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.076165 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.076838 5008 status_manager.go:851] "Failed to get status for pod" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-7d66f7697f-2vlzj\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.077540 5008 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.078048 5008 status_manager.go:851] "Failed to get status for pod" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-5765b658-hkfvl\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.078514 5008 status_manager.go:851] "Failed to get status for pod" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-86cc6c797c-xj5wk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.078957 5008 status_manager.go:851] "Failed to get status for pod" podUID="a593559a-2caa-41b9-86bd-5f290b91f6ae" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-75b97bfb54-shcmm\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.079453 5008 status_manager.go:851] "Failed to get status for pod" podUID="72dcab39-084a-40c0-8646-acf173ea065d" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-7f6c555587-tvtn9\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc 
kubenswrapper[5008]: I1126 23:02:53.079988 5008 status_manager.go:851] "Failed to get status for pod" podUID="9309fbab-00dc-4e76-a384-b9297f098fe9" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/infra-operator-controller-manager-5476c5fbf7-zg2vt\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.080481 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.080916 5008 status_manager.go:851] "Failed to get status for pod" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-7d66f7697f-2vlzj\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.081550 5008 status_manager.go:851] "Failed to get status for pod" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-hg8m4\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.082056 5008 status_manager.go:851] "Failed to get status for pod" podUID="95e78ec8-1a94-47ca-b471-10ba505c5583" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/glance-operator-controller-manager-76d465bf76-r74xk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.082572 5008 status_manager.go:851] "Failed to get status for pod" podUID="1308c6b2-bbac-4fa2-a8cb-7a3028f05ae5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 26 23:02:53 crc kubenswrapper[5008]: I1126 23:02:53.190351 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 23:02:54 crc kubenswrapper[5008]: I1126 23:02:54.091267 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8fe8c28a47fabccb028a1b34047f33bf85e3759e9bc7361e51233eedf0bdeafc"} Nov 26 23:02:54 crc kubenswrapper[5008]: I1126 23:02:54.091624 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2867c74af208ae3214a9c3c693d07955825551833aa6edcf6fcbf0e3d70effe2"} Nov 26 23:02:54 crc kubenswrapper[5008]: I1126 23:02:54.091647 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"575fba48e54ef2ce167fb63d6861ab65417a7f5536e0e0c60f4664480b30b798"} Nov 26 23:02:54 crc kubenswrapper[5008]: I1126 23:02:54.102326 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 26 23:02:54 
crc kubenswrapper[5008]: I1126 23:02:54.102411 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9a8e99fa7c260779ec2b072cddac478ea42540cdd8217939731bff757bf24fd8"} Nov 26 23:02:54 crc kubenswrapper[5008]: I1126 23:02:54.661855 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" Nov 26 23:02:54 crc kubenswrapper[5008]: I1126 23:02:54.661896 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" Nov 26 23:02:54 crc kubenswrapper[5008]: I1126 23:02:54.662403 5008 scope.go:117] "RemoveContainer" containerID="d7113e37844cbe56c2bf691d9b421f0506f62fca35d414a3a654d6e48c6fb1db" Nov 26 23:02:54 crc kubenswrapper[5008]: E1126 23:02:54.662599 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=horizon-operator-controller-manager-5765b658-hkfvl_openstack-operators(0280585a-1314-4eac-9fc6-d83aa687a4f4)\"" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" Nov 26 23:02:54 crc kubenswrapper[5008]: I1126 23:02:54.707278 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" Nov 26 23:02:54 crc kubenswrapper[5008]: I1126 23:02:54.708148 5008 scope.go:117] "RemoveContainer" containerID="94fbd05470d91b12c8f82de8b5863429ee9e4c3406ad075b5db841e8a127db03" Nov 26 23:02:54 crc kubenswrapper[5008]: E1126 23:02:54.708388 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 
10s restarting failed container=manager pod=glance-operator-controller-manager-76d465bf76-r74xk_openstack-operators(95e78ec8-1a94-47ca-b471-10ba505c5583)\"" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" podUID="95e78ec8-1a94-47ca-b471-10ba505c5583" Nov 26 23:02:55 crc kubenswrapper[5008]: I1126 23:02:55.113904 5008 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca6251c6-3571-4928-a464-ab761a51d240" Nov 26 23:02:55 crc kubenswrapper[5008]: I1126 23:02:55.113939 5008 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca6251c6-3571-4928-a464-ab761a51d240" Nov 26 23:02:55 crc kubenswrapper[5008]: I1126 23:02:55.114249 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b3e3edf6e59017d7edf0f84fd0d706c8eecdff546273a325f5bb98d8229b996c"} Nov 26 23:02:55 crc kubenswrapper[5008]: I1126 23:02:55.114275 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4c1883d1dfe2aeb6857942df0c4a447ec1feec439af15843ba54ec04373e6e83"} Nov 26 23:02:55 crc kubenswrapper[5008]: I1126 23:02:55.114308 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 23:02:55 crc kubenswrapper[5008]: I1126 23:02:55.691008 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 23:02:56 crc kubenswrapper[5008]: I1126 23:02:56.547777 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 23:02:56 crc kubenswrapper[5008]: I1126 23:02:56.548145 5008 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 23:02:56 crc kubenswrapper[5008]: I1126 23:02:56.553845 5008 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]log ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]etcd ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/start-apiserver-admission-initializer ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/openshift.io-api-request-count-filter ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/openshift.io-startkubeinformers ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/generic-apiserver-start-informers ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/priority-and-fairness-config-consumer ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/priority-and-fairness-filter ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/start-apiextensions-informers ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/start-apiextensions-controllers ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/crd-informer-synced ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/start-system-namespaces-controller ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/start-cluster-authentication-info-controller ok Nov 26 23:02:56 crc kubenswrapper[5008]: 
[+]poststarthook/start-kube-apiserver-identity-lease-controller ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/start-legacy-token-tracking-controller ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/start-service-ip-repair-controllers ok Nov 26 23:02:56 crc kubenswrapper[5008]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/priority-and-fairness-config-producer ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/bootstrap-controller ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/start-kube-aggregator-informers ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/apiservice-status-local-available-controller ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/apiservice-status-remote-available-controller ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/apiservice-registration-controller ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/apiservice-wait-for-first-sync ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/apiservice-discovery-controller ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/kube-apiserver-autoregistration ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]autoregister-completion ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/apiservice-openapi-controller ok Nov 26 23:02:56 crc kubenswrapper[5008]: [+]poststarthook/apiservice-openapiv3-controller ok Nov 26 23:02:56 crc kubenswrapper[5008]: livez check failed Nov 26 23:02:56 crc kubenswrapper[5008]: I1126 23:02:56.553915 5008 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 23:02:57 crc kubenswrapper[5008]: I1126 23:02:57.279231 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" Nov 26 23:02:57 crc kubenswrapper[5008]: I1126 23:02:57.280227 5008 scope.go:117] "RemoveContainer" containerID="0abbfc6bb258bfa371e1350a69f1897c26c72c251359e79a8b88408682157837" Nov 26 23:02:57 crc kubenswrapper[5008]: E1126 23:02:57.280613 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=infra-operator-controller-manager-5476c5fbf7-zg2vt_openstack-operators(9309fbab-00dc-4e76-a384-b9297f098fe9)\"" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" podUID="9309fbab-00dc-4e76-a384-b9297f098fe9" Nov 26 23:02:57 crc kubenswrapper[5008]: I1126 23:02:57.566602 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" Nov 26 23:02:57 crc kubenswrapper[5008]: I1126 23:02:57.566683 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" Nov 26 23:02:57 crc kubenswrapper[5008]: I1126 23:02:57.567616 5008 scope.go:117] "RemoveContainer" containerID="303cf3d2ab99c50b9f24201859ac253edb4b750718e3e717aecdf73ae5835e30" Nov 26 23:02:57 crc kubenswrapper[5008]: E1126 23:02:57.568133 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager 
pod=keystone-operator-controller-manager-7f6c555587-tvtn9_openstack-operators(72dcab39-084a-40c0-8646-acf173ea065d)\"" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" podUID="72dcab39-084a-40c0-8646-acf173ea065d" Nov 26 23:02:59 crc kubenswrapper[5008]: I1126 23:02:59.281481 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 23:02:59 crc kubenswrapper[5008]: I1126 23:02:59.281838 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 23:03:00 crc kubenswrapper[5008]: I1126 23:03:00.147759 5008 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 23:03:00 crc kubenswrapper[5008]: I1126 23:03:00.474547 5008 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a6072126-0d9e-4144-b50e-9e920295aac0" Nov 26 23:03:01 crc kubenswrapper[5008]: I1126 23:03:01.033120 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" Nov 26 23:03:01 crc kubenswrapper[5008]: I1126 23:03:01.033629 5008 scope.go:117] "RemoveContainer" containerID="4926f3fa2a4ba11f026f9ed936cb85b32d8be5faff2cd99cd659a4d759cfcb84" Nov 26 23:03:01 crc kubenswrapper[5008]: I1126 23:03:01.177728 5008 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca6251c6-3571-4928-a464-ab761a51d240" Nov 26 23:03:01 crc kubenswrapper[5008]: I1126 23:03:01.177772 5008 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca6251c6-3571-4928-a464-ab761a51d240" Nov 26 23:03:01 crc kubenswrapper[5008]: I1126 23:03:01.180684 5008 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a6072126-0d9e-4144-b50e-9e920295aac0" Nov 26 23:03:01 crc kubenswrapper[5008]: I1126 23:03:01.623868 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" Nov 26 23:03:01 crc kubenswrapper[5008]: I1126 23:03:01.624231 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" Nov 26 23:03:01 crc kubenswrapper[5008]: I1126 23:03:01.625179 5008 scope.go:117] "RemoveContainer" containerID="d24e00d6b89b869314590fb24edbf9f9349cfc7f5a49f74e3f74db57beaba01b" Nov 26 23:03:02 crc kubenswrapper[5008]: I1126 23:03:02.193640 5008 generic.go:334] "Generic (PLEG): container finished" podID="a593559a-2caa-41b9-86bd-5f290b91f6ae" containerID="cfb8800e64c0b7159f09b18917f5890a1f6db442bd40a08f6584c39513b07a6c" exitCode=1 Nov 26 23:03:02 crc kubenswrapper[5008]: I1126 23:03:02.193751 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" event={"ID":"a593559a-2caa-41b9-86bd-5f290b91f6ae","Type":"ContainerDied","Data":"cfb8800e64c0b7159f09b18917f5890a1f6db442bd40a08f6584c39513b07a6c"} Nov 26 23:03:02 crc kubenswrapper[5008]: I1126 23:03:02.193821 5008 scope.go:117] "RemoveContainer" containerID="d24e00d6b89b869314590fb24edbf9f9349cfc7f5a49f74e3f74db57beaba01b" Nov 26 
23:03:02 crc kubenswrapper[5008]: I1126 23:03:02.194697 5008 scope.go:117] "RemoveContainer" containerID="cfb8800e64c0b7159f09b18917f5890a1f6db442bd40a08f6584c39513b07a6c" Nov 26 23:03:02 crc kubenswrapper[5008]: E1126 23:03:02.195229 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=mariadb-operator-controller-manager-75b97bfb54-shcmm_openstack-operators(a593559a-2caa-41b9-86bd-5f290b91f6ae)\"" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" podUID="a593559a-2caa-41b9-86bd-5f290b91f6ae" Nov 26 23:03:02 crc kubenswrapper[5008]: I1126 23:03:02.197655 5008 generic.go:334] "Generic (PLEG): container finished" podID="aae33cf9-f71c-4878-86c4-218de3173f3a" containerID="a09fcee0f5c0edbc112615a422820e105a94d7c6aff3cfac7a2719894397ff9f" exitCode=1 Nov 26 23:03:02 crc kubenswrapper[5008]: I1126 23:03:02.197711 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" event={"ID":"aae33cf9-f71c-4878-86c4-218de3173f3a","Type":"ContainerDied","Data":"a09fcee0f5c0edbc112615a422820e105a94d7c6aff3cfac7a2719894397ff9f"} Nov 26 23:03:02 crc kubenswrapper[5008]: I1126 23:03:02.198543 5008 scope.go:117] "RemoveContainer" containerID="a09fcee0f5c0edbc112615a422820e105a94d7c6aff3cfac7a2719894397ff9f" Nov 26 23:03:02 crc kubenswrapper[5008]: E1126 23:03:02.199036 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=swift-operator-controller-manager-86cc6c797c-xj5wk_openstack-operators(aae33cf9-f71c-4878-86c4-218de3173f3a)\"" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" Nov 26 23:03:02 crc kubenswrapper[5008]: I1126 23:03:02.269742 5008 
scope.go:117] "RemoveContainer" containerID="4926f3fa2a4ba11f026f9ed936cb85b32d8be5faff2cd99cd659a4d759cfcb84" Nov 26 23:03:02 crc kubenswrapper[5008]: I1126 23:03:02.519433 5008 scope.go:117] "RemoveContainer" containerID="acaf907aeef1d298522c53c55ac006672d3d7b3475784778e7a8ac3ecf54ad2c" Nov 26 23:03:03 crc kubenswrapper[5008]: I1126 23:03:03.190560 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 23:03:03 crc kubenswrapper[5008]: I1126 23:03:03.200078 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 23:03:03 crc kubenswrapper[5008]: I1126 23:03:03.225352 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" event={"ID":"d0c58bd4-2da5-4770-b562-aad453776b10","Type":"ContainerStarted","Data":"cec3a7f0019d93ec1acf5547bc2a58306b2dbc62023e324be6e69b03a6661f03"} Nov 26 23:03:03 crc kubenswrapper[5008]: I1126 23:03:03.240592 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 23:03:04 crc kubenswrapper[5008]: I1126 23:03:04.246708 5008 generic.go:334] "Generic (PLEG): container finished" podID="d0c58bd4-2da5-4770-b562-aad453776b10" containerID="cec3a7f0019d93ec1acf5547bc2a58306b2dbc62023e324be6e69b03a6661f03" exitCode=1 Nov 26 23:03:04 crc kubenswrapper[5008]: I1126 23:03:04.246858 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" event={"ID":"d0c58bd4-2da5-4770-b562-aad453776b10","Type":"ContainerDied","Data":"cec3a7f0019d93ec1acf5547bc2a58306b2dbc62023e324be6e69b03a6661f03"} Nov 26 23:03:04 crc kubenswrapper[5008]: I1126 23:03:04.247034 5008 scope.go:117] "RemoveContainer" 
containerID="acaf907aeef1d298522c53c55ac006672d3d7b3475784778e7a8ac3ecf54ad2c" Nov 26 23:03:04 crc kubenswrapper[5008]: I1126 23:03:04.248073 5008 scope.go:117] "RemoveContainer" containerID="cec3a7f0019d93ec1acf5547bc2a58306b2dbc62023e324be6e69b03a6661f03" Nov 26 23:03:04 crc kubenswrapper[5008]: E1126 23:03:04.248695 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=operator pod=rabbitmq-cluster-operator-779fc9694b-hg8m4_openstack-operators(d0c58bd4-2da5-4770-b562-aad453776b10)\"" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" podUID="d0c58bd4-2da5-4770-b562-aad453776b10" Nov 26 23:03:04 crc kubenswrapper[5008]: I1126 23:03:04.707140 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" Nov 26 23:03:04 crc kubenswrapper[5008]: I1126 23:03:04.708045 5008 scope.go:117] "RemoveContainer" containerID="94fbd05470d91b12c8f82de8b5863429ee9e4c3406ad075b5db841e8a127db03" Nov 26 23:03:05 crc kubenswrapper[5008]: I1126 23:03:05.266370 5008 generic.go:334] "Generic (PLEG): container finished" podID="95e78ec8-1a94-47ca-b471-10ba505c5583" containerID="b80ed22f44e34b4dd44dc2a17d123db6a473c4b94b630f33f1e91ea8b3cf3253" exitCode=1 Nov 26 23:03:05 crc kubenswrapper[5008]: I1126 23:03:05.266432 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" event={"ID":"95e78ec8-1a94-47ca-b471-10ba505c5583","Type":"ContainerDied","Data":"b80ed22f44e34b4dd44dc2a17d123db6a473c4b94b630f33f1e91ea8b3cf3253"} Nov 26 23:03:05 crc kubenswrapper[5008]: I1126 23:03:05.266484 5008 scope.go:117] "RemoveContainer" containerID="94fbd05470d91b12c8f82de8b5863429ee9e4c3406ad075b5db841e8a127db03" Nov 26 23:03:05 crc kubenswrapper[5008]: I1126 23:03:05.267495 5008 scope.go:117] 
"RemoveContainer" containerID="b80ed22f44e34b4dd44dc2a17d123db6a473c4b94b630f33f1e91ea8b3cf3253" Nov 26 23:03:05 crc kubenswrapper[5008]: E1126 23:03:05.267940 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=glance-operator-controller-manager-76d465bf76-r74xk_openstack-operators(95e78ec8-1a94-47ca-b471-10ba505c5583)\"" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" podUID="95e78ec8-1a94-47ca-b471-10ba505c5583" Nov 26 23:03:06 crc kubenswrapper[5008]: I1126 23:03:06.874254 5008 scope.go:117] "RemoveContainer" containerID="d7113e37844cbe56c2bf691d9b421f0506f62fca35d414a3a654d6e48c6fb1db" Nov 26 23:03:07 crc kubenswrapper[5008]: I1126 23:03:07.279214 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" Nov 26 23:03:07 crc kubenswrapper[5008]: I1126 23:03:07.281654 5008 scope.go:117] "RemoveContainer" containerID="0abbfc6bb258bfa371e1350a69f1897c26c72c251359e79a8b88408682157837" Nov 26 23:03:07 crc kubenswrapper[5008]: I1126 23:03:07.294316 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" event={"ID":"0280585a-1314-4eac-9fc6-d83aa687a4f4","Type":"ContainerStarted","Data":"80989502d207313e447e88e5d19ccee3fd036ade78c0c704099614965f715e13"} Nov 26 23:03:07 crc kubenswrapper[5008]: I1126 23:03:07.295254 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" Nov 26 23:03:07 crc kubenswrapper[5008]: I1126 23:03:07.518995 5008 scope.go:117] "RemoveContainer" containerID="1a2e495cca0fb5bc4a901d584656bf291b5af480354496e65e92345f2d68c1e7" Nov 26 23:03:08 crc kubenswrapper[5008]: I1126 23:03:08.311332 5008 generic.go:334] "Generic (PLEG): 
container finished" podID="9309fbab-00dc-4e76-a384-b9297f098fe9" containerID="186bdeccc77766b9f6f4646d0f1ac8b578e5875f33c3484214a87f7e351f0a8c" exitCode=1 Nov 26 23:03:08 crc kubenswrapper[5008]: I1126 23:03:08.311398 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" event={"ID":"9309fbab-00dc-4e76-a384-b9297f098fe9","Type":"ContainerDied","Data":"186bdeccc77766b9f6f4646d0f1ac8b578e5875f33c3484214a87f7e351f0a8c"} Nov 26 23:03:08 crc kubenswrapper[5008]: I1126 23:03:08.311758 5008 scope.go:117] "RemoveContainer" containerID="0abbfc6bb258bfa371e1350a69f1897c26c72c251359e79a8b88408682157837" Nov 26 23:03:08 crc kubenswrapper[5008]: I1126 23:03:08.312466 5008 scope.go:117] "RemoveContainer" containerID="186bdeccc77766b9f6f4646d0f1ac8b578e5875f33c3484214a87f7e351f0a8c" Nov 26 23:03:08 crc kubenswrapper[5008]: E1126 23:03:08.312924 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=infra-operator-controller-manager-5476c5fbf7-zg2vt_openstack-operators(9309fbab-00dc-4e76-a384-b9297f098fe9)\"" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" podUID="9309fbab-00dc-4e76-a384-b9297f098fe9" Nov 26 23:03:08 crc kubenswrapper[5008]: I1126 23:03:08.318350 5008 generic.go:334] "Generic (PLEG): container finished" podID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" containerID="c7d368ff4d915d18db481b35695be0e928b815aa52824929775bbc61499cb266" exitCode=1 Nov 26 23:03:08 crc kubenswrapper[5008]: I1126 23:03:08.318409 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" event={"ID":"9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5","Type":"ContainerDied","Data":"c7d368ff4d915d18db481b35695be0e928b815aa52824929775bbc61499cb266"} Nov 26 23:03:08 crc kubenswrapper[5008]: I1126 23:03:08.318778 
5008 scope.go:117] "RemoveContainer" containerID="c7d368ff4d915d18db481b35695be0e928b815aa52824929775bbc61499cb266" Nov 26 23:03:08 crc kubenswrapper[5008]: E1126 23:03:08.319065 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=metallb-operator-controller-manager-7d66f7697f-2vlzj_metallb-system(9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5)\"" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" Nov 26 23:03:08 crc kubenswrapper[5008]: I1126 23:03:08.322347 5008 generic.go:334] "Generic (PLEG): container finished" podID="0280585a-1314-4eac-9fc6-d83aa687a4f4" containerID="80989502d207313e447e88e5d19ccee3fd036ade78c0c704099614965f715e13" exitCode=1 Nov 26 23:03:08 crc kubenswrapper[5008]: I1126 23:03:08.322395 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" event={"ID":"0280585a-1314-4eac-9fc6-d83aa687a4f4","Type":"ContainerDied","Data":"80989502d207313e447e88e5d19ccee3fd036ade78c0c704099614965f715e13"} Nov 26 23:03:08 crc kubenswrapper[5008]: I1126 23:03:08.323343 5008 scope.go:117] "RemoveContainer" containerID="80989502d207313e447e88e5d19ccee3fd036ade78c0c704099614965f715e13" Nov 26 23:03:08 crc kubenswrapper[5008]: E1126 23:03:08.323816 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=horizon-operator-controller-manager-5765b658-hkfvl_openstack-operators(0280585a-1314-4eac-9fc6-d83aa687a4f4)\"" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" Nov 26 23:03:08 crc kubenswrapper[5008]: I1126 23:03:08.403775 5008 scope.go:117] "RemoveContainer" 
containerID="1a2e495cca0fb5bc4a901d584656bf291b5af480354496e65e92345f2d68c1e7" Nov 26 23:03:08 crc kubenswrapper[5008]: I1126 23:03:08.446990 5008 scope.go:117] "RemoveContainer" containerID="d7113e37844cbe56c2bf691d9b421f0506f62fca35d414a3a654d6e48c6fb1db" Nov 26 23:03:09 crc kubenswrapper[5008]: I1126 23:03:09.342669 5008 scope.go:117] "RemoveContainer" containerID="80989502d207313e447e88e5d19ccee3fd036ade78c0c704099614965f715e13" Nov 26 23:03:09 crc kubenswrapper[5008]: E1126 23:03:09.344287 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=horizon-operator-controller-manager-5765b658-hkfvl_openstack-operators(0280585a-1314-4eac-9fc6-d83aa687a4f4)\"" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" Nov 26 23:03:10 crc kubenswrapper[5008]: I1126 23:03:10.309623 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 26 23:03:10 crc kubenswrapper[5008]: I1126 23:03:10.780470 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 26 23:03:10 crc kubenswrapper[5008]: I1126 23:03:10.797617 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-5g58j" Nov 26 23:03:10 crc kubenswrapper[5008]: I1126 23:03:10.801514 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 26 23:03:11 crc kubenswrapper[5008]: I1126 23:03:11.004255 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 26 23:03:11 crc kubenswrapper[5008]: I1126 23:03:11.033790 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" Nov 26 23:03:11 crc kubenswrapper[5008]: I1126 23:03:11.034751 5008 scope.go:117] "RemoveContainer" containerID="a09fcee0f5c0edbc112615a422820e105a94d7c6aff3cfac7a2719894397ff9f" Nov 26 23:03:11 crc kubenswrapper[5008]: E1126 23:03:11.035269 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=swift-operator-controller-manager-86cc6c797c-xj5wk_openstack-operators(aae33cf9-f71c-4878-86c4-218de3173f3a)\"" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a" Nov 26 23:03:11 crc kubenswrapper[5008]: I1126 23:03:11.040504 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 26 23:03:11 crc kubenswrapper[5008]: I1126 23:03:11.367620 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 26 23:03:11 crc kubenswrapper[5008]: I1126 23:03:11.436645 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 26 23:03:11 crc kubenswrapper[5008]: I1126 23:03:11.479155 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 26 23:03:11 crc kubenswrapper[5008]: I1126 23:03:11.624274 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" Nov 26 23:03:11 crc kubenswrapper[5008]: I1126 23:03:11.625143 5008 scope.go:117] "RemoveContainer" containerID="cfb8800e64c0b7159f09b18917f5890a1f6db442bd40a08f6584c39513b07a6c" Nov 26 23:03:11 crc kubenswrapper[5008]: E1126 23:03:11.625451 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=mariadb-operator-controller-manager-75b97bfb54-shcmm_openstack-operators(a593559a-2caa-41b9-86bd-5f290b91f6ae)\"" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" podUID="a593559a-2caa-41b9-86bd-5f290b91f6ae" Nov 26 23:03:11 crc kubenswrapper[5008]: I1126 23:03:11.883083 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-8xvhw" Nov 26 23:03:11 crc kubenswrapper[5008]: I1126 23:03:11.920098 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-lbp6p" Nov 26 23:03:11 crc kubenswrapper[5008]: I1126 23:03:11.975633 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 26 23:03:11 crc kubenswrapper[5008]: I1126 23:03:11.988125 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.003347 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.031330 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.101055 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.144702 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.168116 5008 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.189130 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.261083 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.328688 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.492193 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.518267 5008 scope.go:117] "RemoveContainer" containerID="303cf3d2ab99c50b9f24201859ac253edb4b750718e3e717aecdf73ae5835e30" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.524120 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.537338 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.550653 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.614774 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.624012 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.637362 5008 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.645585 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.659433 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.660170 5008 scope.go:117] "RemoveContainer" containerID="c7d368ff4d915d18db481b35695be0e928b815aa52824929775bbc61499cb266" Nov 26 23:03:12 crc kubenswrapper[5008]: E1126 23:03:12.660487 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=metallb-operator-controller-manager-7d66f7697f-2vlzj_metallb-system(9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5)\"" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.687456 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.687847 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-storage-config-data" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.848326 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 26 23:03:12 crc kubenswrapper[5008]: I1126 23:03:12.869358 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 26 23:03:13 crc kubenswrapper[5008]: I1126 23:03:13.160809 5008 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 26 23:03:13 crc kubenswrapper[5008]: I1126 23:03:13.345624 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 26 23:03:13 crc kubenswrapper[5008]: I1126 23:03:13.393389 5008 generic.go:334] "Generic (PLEG): container finished" podID="72dcab39-084a-40c0-8646-acf173ea065d" containerID="87df0209013418ff1f7b2ae433ff375a3ec956d39448e870540444f03dbbfaf0" exitCode=1 Nov 26 23:03:13 crc kubenswrapper[5008]: I1126 23:03:13.393591 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" event={"ID":"72dcab39-084a-40c0-8646-acf173ea065d","Type":"ContainerDied","Data":"87df0209013418ff1f7b2ae433ff375a3ec956d39448e870540444f03dbbfaf0"} Nov 26 23:03:13 crc kubenswrapper[5008]: I1126 23:03:13.393923 5008 scope.go:117] "RemoveContainer" containerID="303cf3d2ab99c50b9f24201859ac253edb4b750718e3e717aecdf73ae5835e30" Nov 26 23:03:13 crc kubenswrapper[5008]: I1126 23:03:13.394999 5008 scope.go:117] "RemoveContainer" containerID="87df0209013418ff1f7b2ae433ff375a3ec956d39448e870540444f03dbbfaf0" Nov 26 23:03:13 crc kubenswrapper[5008]: E1126 23:03:13.395569 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=keystone-operator-controller-manager-7f6c555587-tvtn9_openstack-operators(72dcab39-084a-40c0-8646-acf173ea065d)\"" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" podUID="72dcab39-084a-40c0-8646-acf173ea065d" Nov 26 23:03:13 crc kubenswrapper[5008]: I1126 23:03:13.426623 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 26 23:03:13 crc kubenswrapper[5008]: I1126 23:03:13.453820 5008 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 26 23:03:13 crc kubenswrapper[5008]: I1126 23:03:13.482565 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 26 23:03:13 crc kubenswrapper[5008]: I1126 23:03:13.541499 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 26 23:03:13 crc kubenswrapper[5008]: I1126 23:03:13.564188 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-288r6" Nov 26 23:03:13 crc kubenswrapper[5008]: I1126 23:03:13.570365 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 26 23:03:13 crc kubenswrapper[5008]: I1126 23:03:13.679813 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 26 23:03:13 crc kubenswrapper[5008]: I1126 23:03:13.736342 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-fz8w6" Nov 26 23:03:13 crc kubenswrapper[5008]: I1126 23:03:13.773396 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 26 23:03:13 crc kubenswrapper[5008]: I1126 23:03:13.814773 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-files" Nov 26 23:03:13 crc kubenswrapper[5008]: I1126 23:03:13.825158 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-index-dockercfg-8fx9r" Nov 26 23:03:13 crc kubenswrapper[5008]: I1126 23:03:13.899694 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config" Nov 26 23:03:13 crc 
kubenswrapper[5008]: I1126 23:03:13.910550 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 26 23:03:13 crc kubenswrapper[5008]: I1126 23:03:13.927064 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-xw689" Nov 26 23:03:13 crc kubenswrapper[5008]: I1126 23:03:13.949855 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.040652 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.064916 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.085581 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.144818 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.184769 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.216268 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.253480 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.274821 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 26 23:03:14 crc 
kubenswrapper[5008]: I1126 23:03:14.285622 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.295914 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.333695 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.342094 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.429701 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.452908 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.504169 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.515772 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-d5mdr" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.562799 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.661241 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.662343 
5008 scope.go:117] "RemoveContainer" containerID="80989502d207313e447e88e5d19ccee3fd036ade78c0c704099614965f715e13" Nov 26 23:03:14 crc kubenswrapper[5008]: E1126 23:03:14.662847 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=horizon-operator-controller-manager-5765b658-hkfvl_openstack-operators(0280585a-1314-4eac-9fc6-d83aa687a4f4)\"" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" podUID="0280585a-1314-4eac-9fc6-d83aa687a4f4" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.680458 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.680504 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.706337 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.707342 5008 scope.go:117] "RemoveContainer" containerID="b80ed22f44e34b4dd44dc2a17d123db6a473c4b94b630f33f1e91ea8b3cf3253" Nov 26 23:03:14 crc kubenswrapper[5008]: E1126 23:03:14.707717 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=glance-operator-controller-manager-76d465bf76-r74xk_openstack-operators(95e78ec8-1a94-47ca-b471-10ba505c5583)\"" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" podUID="95e78ec8-1a94-47ca-b471-10ba505c5583" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.731081 5008 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"metallb-system"/"kube-root-ca.crt" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.845285 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.847609 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.914736 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-x2qpv" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.915747 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-server-dockercfg-977dp" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.942587 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 26 23:03:14 crc kubenswrapper[5008]: I1126 23:03:14.965606 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.031481 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.032877 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.097988 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.126308 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"galera-openstack-dockercfg-ct9sx" Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.161223 5008 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.213064 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.266085 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.371732 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.482584 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.507248 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.534786 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.609100 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone"
Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.630112 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.645205 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.712129 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.786415 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.805549 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.908998 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data"
Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.920672 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-index-dockercfg-8rxbq"
Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.944695 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Nov 26 23:03:15 crc kubenswrapper[5008]: I1126 23:03:15.963648 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.055383 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.118271 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-swift-dockercfg-995x9"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.136652 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.194207 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.287174 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"kube-root-ca.crt"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.292635 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.393999 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-service-cert"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.398177 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.444512 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.456263 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.479724 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-server-conf"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.482056 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-proxy-config-data"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.488148 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.507632 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.518772 5008 scope.go:117] "RemoveContainer" containerID="cec3a7f0019d93ec1acf5547bc2a58306b2dbc62023e324be6e69b03a6661f03"
Nov 26 23:03:16 crc kubenswrapper[5008]: E1126 23:03:16.519075 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=operator pod=rabbitmq-cluster-operator-779fc9694b-hg8m4_openstack-operators(d0c58bd4-2da5-4770-b562-aad453776b10)\"" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" podUID="d0c58bd4-2da5-4770-b562-aad453776b10"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.545316 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.569238 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"memcached-memcached-dockercfg-gw8b6"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.569998 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.602634 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.658807 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.694005 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.778700 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Nov 26 23:03:16 crc kubenswrapper[5008]: I1126 23:03:16.849364 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.100442 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.157092 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.249987 5008 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.279331 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.280301 5008 scope.go:117] "RemoveContainer" containerID="186bdeccc77766b9f6f4646d0f1ac8b578e5875f33c3484214a87f7e351f0a8c"
Nov 26 23:03:17 crc kubenswrapper[5008]: E1126 23:03:17.280700 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=infra-operator-controller-manager-5476c5fbf7-zg2vt_openstack-operators(9309fbab-00dc-4e76-a384-b9297f098fe9)\"" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" podUID="9309fbab-00dc-4e76-a384-b9297f098fe9"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.282669 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.332617 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.332689 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-rgs2j"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.444372 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.451230 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.463897 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.565806 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.565858 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.566920 5008 scope.go:117] "RemoveContainer" containerID="87df0209013418ff1f7b2ae433ff375a3ec956d39448e870540444f03dbbfaf0"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.567096 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.567123 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Nov 26 23:03:17 crc kubenswrapper[5008]: E1126 23:03:17.567506 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=keystone-operator-controller-manager-7f6c555587-tvtn9_openstack-operators(72dcab39-084a-40c0-8646-acf173ea065d)\"" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" podUID="72dcab39-084a-40c0-8646-acf173ea065d"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.576362 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.595427 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.595577 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.597455 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.692194 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.711356 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.714510 5008 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.768026 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-service-cert"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.809080 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.809080 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.849160 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-mmw8w"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.920826 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Nov 26 23:03:17 crc kubenswrapper[5008]: I1126 23:03:17.944422 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.016272 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.030172 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.032270 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.041762 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.076398 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-pgdvf"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.126998 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.154312 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.246704 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.310623 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-hzbrd"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.383637 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.388686 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.440613 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.442433 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.522021 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.526468 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.533791 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-cr7cs"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.620099 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.664792 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.684490 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.729790 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.837793 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.865829 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.901411 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.909245 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.956557 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.974433 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Nov 26 23:03:18 crc kubenswrapper[5008]: I1126 23:03:18.979313 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert"
Nov 26 23:03:19 crc kubenswrapper[5008]: I1126 23:03:19.020051 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Nov 26 23:03:19 crc kubenswrapper[5008]: I1126 23:03:19.110590 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Nov 26 23:03:19 crc kubenswrapper[5008]: I1126 23:03:19.124800 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Nov 26 23:03:19 crc kubenswrapper[5008]: I1126 23:03:19.175093 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Nov 26 23:03:19 crc kubenswrapper[5008]: I1126 23:03:19.207741 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Nov 26 23:03:19 crc kubenswrapper[5008]: I1126 23:03:19.242594 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Nov 26 23:03:19 crc kubenswrapper[5008]: I1126 23:03:19.305244 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Nov 26 23:03:19 crc kubenswrapper[5008]: I1126 23:03:19.336505 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-default-user"
Nov 26 23:03:19 crc kubenswrapper[5008]: I1126 23:03:19.336917 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Nov 26 23:03:19 crc kubenswrapper[5008]: I1126 23:03:19.350961 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Nov 26 23:03:19 crc kubenswrapper[5008]: I1126 23:03:19.367543 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Nov 26 23:03:19 crc kubenswrapper[5008]: I1126 23:03:19.415862 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Nov 26 23:03:19 crc kubenswrapper[5008]: I1126 23:03:19.458269 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Nov 26 23:03:19 crc kubenswrapper[5008]: I1126 23:03:19.484184 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Nov 26 23:03:19 crc kubenswrapper[5008]: I1126 23:03:19.523872 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Nov 26 23:03:19 crc kubenswrapper[5008]: I1126 23:03:19.665087 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Nov 26 23:03:19 crc kubenswrapper[5008]: I1126 23:03:19.832950 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Nov 26 23:03:19 crc kubenswrapper[5008]: I1126 23:03:19.889209 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-erlang-cookie"
Nov 26 23:03:19 crc kubenswrapper[5008]: I1126 23:03:19.935281 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Nov 26 23:03:19 crc kubenswrapper[5008]: I1126 23:03:19.981912 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.015452 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-tdgbt"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.038545 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.064221 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.091904 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.113724 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.158303 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.166793 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.170432 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.196431 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.210994 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.252103 5008 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.264311 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.290380 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.321218 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.327643 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.327775 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.368847 5008 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.419908 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.585420 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.623199 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.668773 5008 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.680656 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=42.680624717 podStartE2EDuration="42.680624717s" podCreationTimestamp="2025-11-26 23:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:03:00.173673653 +0000 UTC m=+1455.586367695" watchObservedRunningTime="2025-11-26 23:03:20.680624717 +0000 UTC m=+1476.093318749"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.684434 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.684511 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.685100 5008 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca6251c6-3571-4928-a464-ab761a51d240"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.685149 5008 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca6251c6-3571-4928-a464-ab761a51d240"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.690223 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.692562 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.717927 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.717454664 podStartE2EDuration="20.717454664s" podCreationTimestamp="2025-11-26 23:03:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 23:03:20.714839791 +0000 UTC m=+1476.127533823" watchObservedRunningTime="2025-11-26 23:03:20.717454664 +0000 UTC m=+1476.130148696"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.756410 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.799920 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.828433 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.874043 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.875253 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.885602 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.891795 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.942552 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.961938 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.964358 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Nov 26 23:03:20 crc kubenswrapper[5008]: I1126 23:03:20.982622 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.015672 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.033435 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.034650 5008 scope.go:117] "RemoveContainer" containerID="a09fcee0f5c0edbc112615a422820e105a94d7c6aff3cfac7a2719894397ff9f"
Nov 26 23:03:21 crc kubenswrapper[5008]: E1126 23:03:21.035066 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=swift-operator-controller-manager-86cc6c797c-xj5wk_openstack-operators(aae33cf9-f71c-4878-86c4-218de3173f3a)\"" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" podUID="aae33cf9-f71c-4878-86c4-218de3173f3a"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.061905 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.135857 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.169048 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.176481 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.178377 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.206366 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-plugins-conf"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.247710 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.255926 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.313173 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.406746 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.435520 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.445883 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.477688 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.507536 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.524372 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.554544 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.554930 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.612895 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.623653 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.624356 5008 scope.go:117] "RemoveContainer" containerID="cfb8800e64c0b7159f09b18917f5890a1f6db442bd40a08f6584c39513b07a6c"
Nov 26 23:03:21 crc kubenswrapper[5008]: E1126 23:03:21.624731 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=mariadb-operator-controller-manager-75b97bfb54-shcmm_openstack-operators(a593559a-2caa-41b9-86bd-5f290b91f6ae)\"" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" podUID="a593559a-2caa-41b9-86bd-5f290b91f6ae"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.760771 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.825823 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.833035 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-service-cert"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.859095 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.923054 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Nov 26 23:03:21 crc kubenswrapper[5008]: I1126 23:03:21.953683 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.173245 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-index-dockercfg-qzk5r"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.177506 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.213437 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.227583 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.251447 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.332782 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.485737 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.494443 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.537726 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.570886 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.595831 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-vmzv2"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.601453 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.605413 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.620205 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.674758 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.711645 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.788004 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.862825 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.879430 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.880401 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.918723 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Nov 26 23:03:22 crc kubenswrapper[5008]: I1126 23:03:22.987800 5008 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 26 23:03:23 crc kubenswrapper[5008]: I1126 23:03:23.057415 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 26 23:03:23 crc kubenswrapper[5008]: I1126 23:03:23.111473 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 26 23:03:23 crc kubenswrapper[5008]: I1126 23:03:23.136465 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-dkwr2" Nov 26 23:03:23 crc kubenswrapper[5008]: I1126 23:03:23.325995 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 26 23:03:23 crc kubenswrapper[5008]: I1126 23:03:23.339392 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 26 23:03:23 crc kubenswrapper[5008]: I1126 23:03:23.374062 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 26 23:03:23 crc kubenswrapper[5008]: I1126 23:03:23.459581 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 26 23:03:23 crc kubenswrapper[5008]: I1126 23:03:23.572465 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config-data" Nov 26 23:03:23 crc kubenswrapper[5008]: I1126 23:03:23.659561 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 26 23:03:23 crc kubenswrapper[5008]: I1126 23:03:23.659853 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-67b7t" Nov 26 23:03:23 
crc kubenswrapper[5008]: I1126 23:03:23.742209 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-conf" Nov 26 23:03:23 crc kubenswrapper[5008]: I1126 23:03:23.754593 5008 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 26 23:03:23 crc kubenswrapper[5008]: I1126 23:03:23.905626 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 26 23:03:23 crc kubenswrapper[5008]: I1126 23:03:23.927855 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 26 23:03:23 crc kubenswrapper[5008]: I1126 23:03:23.959279 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 26 23:03:23 crc kubenswrapper[5008]: I1126 23:03:23.998382 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 26 23:03:24 crc kubenswrapper[5008]: I1126 23:03:24.102526 5008 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Nov 26 23:03:24 crc kubenswrapper[5008]: I1126 23:03:24.130251 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 26 23:03:24 crc kubenswrapper[5008]: I1126 23:03:24.359437 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Nov 26 23:03:24 crc kubenswrapper[5008]: I1126 23:03:24.362888 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 26 23:03:24 crc kubenswrapper[5008]: I1126 23:03:24.454737 5008 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"service-ca-bundle" Nov 26 23:03:24 crc kubenswrapper[5008]: I1126 23:03:24.482111 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 26 23:03:24 crc kubenswrapper[5008]: I1126 23:03:24.545001 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 26 23:03:24 crc kubenswrapper[5008]: I1126 23:03:24.622794 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 26 23:03:24 crc kubenswrapper[5008]: I1126 23:03:24.655508 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 26 23:03:24 crc kubenswrapper[5008]: I1126 23:03:24.706850 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" Nov 26 23:03:24 crc kubenswrapper[5008]: I1126 23:03:24.707753 5008 scope.go:117] "RemoveContainer" containerID="b80ed22f44e34b4dd44dc2a17d123db6a473c4b94b630f33f1e91ea8b3cf3253" Nov 26 23:03:24 crc kubenswrapper[5008]: E1126 23:03:24.708116 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=glance-operator-controller-manager-76d465bf76-r74xk_openstack-operators(95e78ec8-1a94-47ca-b471-10ba505c5583)\"" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" podUID="95e78ec8-1a94-47ca-b471-10ba505c5583" Nov 26 23:03:24 crc kubenswrapper[5008]: I1126 23:03:24.719021 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 26 23:03:24 crc kubenswrapper[5008]: I1126 23:03:24.732449 5008 reflector.go:368] Caches populated for *v1.Secret from 
object-"glance-kuttl-tests"/"glance-default-internal-config-data" Nov 26 23:03:24 crc kubenswrapper[5008]: I1126 23:03:24.793817 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openshift-service-ca.crt" Nov 26 23:03:24 crc kubenswrapper[5008]: I1126 23:03:24.799174 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 26 23:03:24 crc kubenswrapper[5008]: I1126 23:03:24.849228 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 26 23:03:24 crc kubenswrapper[5008]: I1126 23:03:24.881832 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 26 23:03:24 crc kubenswrapper[5008]: I1126 23:03:24.889396 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 26 23:03:24 crc kubenswrapper[5008]: I1126 23:03:24.896908 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 26 23:03:25 crc kubenswrapper[5008]: I1126 23:03:25.022242 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 26 23:03:25 crc kubenswrapper[5008]: I1126 23:03:25.065785 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts" Nov 26 23:03:25 crc kubenswrapper[5008]: I1126 23:03:25.077715 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 26 23:03:25 crc kubenswrapper[5008]: I1126 23:03:25.228299 5008 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 26 23:03:25 crc kubenswrapper[5008]: I1126 23:03:25.305159 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 26 23:03:25 crc kubenswrapper[5008]: I1126 23:03:25.455793 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 26 23:03:25 crc kubenswrapper[5008]: I1126 23:03:25.456455 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-qf5sb" Nov 26 23:03:25 crc kubenswrapper[5008]: I1126 23:03:25.460207 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 26 23:03:25 crc kubenswrapper[5008]: I1126 23:03:25.523807 5008 scope.go:117] "RemoveContainer" containerID="c7d368ff4d915d18db481b35695be0e928b815aa52824929775bbc61499cb266" Nov 26 23:03:25 crc kubenswrapper[5008]: E1126 23:03:25.524015 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=metallb-operator-controller-manager-7d66f7697f-2vlzj_metallb-system(9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5)\"" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" podUID="9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5" Nov 26 23:03:25 crc kubenswrapper[5008]: I1126 23:03:25.700459 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 26 23:03:26 crc kubenswrapper[5008]: I1126 23:03:26.001286 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 26 23:03:26 crc kubenswrapper[5008]: I1126 23:03:26.084230 5008 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 26 23:03:26 crc kubenswrapper[5008]: I1126 23:03:26.318824 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 26 23:03:26 crc kubenswrapper[5008]: I1126 23:03:26.325944 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 26 23:03:26 crc kubenswrapper[5008]: I1126 23:03:26.367325 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 26 23:03:26 crc kubenswrapper[5008]: I1126 23:03:26.446397 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 26 23:03:27 crc kubenswrapper[5008]: I1126 23:03:27.032068 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 26 23:03:27 crc kubenswrapper[5008]: I1126 23:03:27.279874 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" Nov 26 23:03:27 crc kubenswrapper[5008]: I1126 23:03:27.281128 5008 scope.go:117] "RemoveContainer" containerID="186bdeccc77766b9f6f4646d0f1ac8b578e5875f33c3484214a87f7e351f0a8c" Nov 26 23:03:27 crc kubenswrapper[5008]: E1126 23:03:27.281457 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=infra-operator-controller-manager-5476c5fbf7-zg2vt_openstack-operators(9309fbab-00dc-4e76-a384-b9297f098fe9)\"" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" podUID="9309fbab-00dc-4e76-a384-b9297f098fe9" Nov 26 23:03:27 crc kubenswrapper[5008]: I1126 23:03:27.383006 
5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 26 23:03:27 crc kubenswrapper[5008]: I1126 23:03:27.466899 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"memcached-config-data" Nov 26 23:03:27 crc kubenswrapper[5008]: I1126 23:03:27.518400 5008 scope.go:117] "RemoveContainer" containerID="cec3a7f0019d93ec1acf5547bc2a58306b2dbc62023e324be6e69b03a6661f03" Nov 26 23:03:28 crc kubenswrapper[5008]: I1126 23:03:28.542053 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hg8m4" event={"ID":"d0c58bd4-2da5-4770-b562-aad453776b10","Type":"ContainerStarted","Data":"68700bfc1db3855a92abef5646db59beed603e98347776c0f97d0bc50c49dcbc"} Nov 26 23:03:29 crc kubenswrapper[5008]: I1126 23:03:29.280867 5008 patch_prober.go:28] interesting pod/machine-config-daemon-4qkmj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 23:03:29 crc kubenswrapper[5008]: I1126 23:03:29.280926 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4qkmj" podUID="8e558d58-c5ad-41f5-930f-36ac26b1a1ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 23:03:29 crc kubenswrapper[5008]: I1126 23:03:29.518778 5008 scope.go:117] "RemoveContainer" containerID="80989502d207313e447e88e5d19ccee3fd036ade78c0c704099614965f715e13" Nov 26 23:03:30 crc kubenswrapper[5008]: I1126 23:03:30.518441 5008 scope.go:117] "RemoveContainer" containerID="87df0209013418ff1f7b2ae433ff375a3ec956d39448e870540444f03dbbfaf0" Nov 26 23:03:30 crc kubenswrapper[5008]: E1126 23:03:30.518941 
5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=keystone-operator-controller-manager-7f6c555587-tvtn9_openstack-operators(72dcab39-084a-40c0-8646-acf173ea065d)\"" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" podUID="72dcab39-084a-40c0-8646-acf173ea065d" Nov 26 23:03:30 crc kubenswrapper[5008]: I1126 23:03:30.587948 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" event={"ID":"0280585a-1314-4eac-9fc6-d83aa687a4f4","Type":"ContainerStarted","Data":"d829235b1f7cbe0c841c070d8dc2b477e04e47d7fbf855d45cdea86f041a84e4"} Nov 26 23:03:30 crc kubenswrapper[5008]: I1126 23:03:30.588234 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" Nov 26 23:03:32 crc kubenswrapper[5008]: I1126 23:03:32.519027 5008 scope.go:117] "RemoveContainer" containerID="a09fcee0f5c0edbc112615a422820e105a94d7c6aff3cfac7a2719894397ff9f" Nov 26 23:03:32 crc kubenswrapper[5008]: I1126 23:03:32.963199 5008 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 26 23:03:32 crc kubenswrapper[5008]: I1126 23:03:32.963431 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://91f8c3ab3f4a5f7b54e77b3e6755816591a0a58f4d462a0cb257a42534c04220" gracePeriod=5 Nov 26 23:03:33 crc kubenswrapper[5008]: I1126 23:03:33.622674 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" 
event={"ID":"aae33cf9-f71c-4878-86c4-218de3173f3a","Type":"ContainerStarted","Data":"1559f32d9e2618f87c581daa8716c2a366fd6ad9ca2cb0294baa94aaeca02788"} Nov 26 23:03:33 crc kubenswrapper[5008]: I1126 23:03:33.623548 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" Nov 26 23:03:34 crc kubenswrapper[5008]: I1126 23:03:34.668343 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5765b658-hkfvl" Nov 26 23:03:35 crc kubenswrapper[5008]: I1126 23:03:35.532848 5008 scope.go:117] "RemoveContainer" containerID="cfb8800e64c0b7159f09b18917f5890a1f6db442bd40a08f6584c39513b07a6c" Nov 26 23:03:36 crc kubenswrapper[5008]: I1126 23:03:36.519174 5008 scope.go:117] "RemoveContainer" containerID="b80ed22f44e34b4dd44dc2a17d123db6a473c4b94b630f33f1e91ea8b3cf3253" Nov 26 23:03:36 crc kubenswrapper[5008]: I1126 23:03:36.664612 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" event={"ID":"a593559a-2caa-41b9-86bd-5f290b91f6ae","Type":"ContainerStarted","Data":"ad5b9d49ea0229f31bb7b4ebc25e82972941a3ca22b23abac7970a3362752ce7"} Nov 26 23:03:36 crc kubenswrapper[5008]: I1126 23:03:36.665273 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" Nov 26 23:03:37 crc kubenswrapper[5008]: I1126 23:03:37.518767 5008 scope.go:117] "RemoveContainer" containerID="c7d368ff4d915d18db481b35695be0e928b815aa52824929775bbc61499cb266" Nov 26 23:03:37 crc kubenswrapper[5008]: I1126 23:03:37.678682 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" 
event={"ID":"95e78ec8-1a94-47ca-b471-10ba505c5583","Type":"ContainerStarted","Data":"723e7709e4f9f38cb0e392b3bbe5c1f92ee4c05e3c2ed045da92b2b8a2017ee7"} Nov 26 23:03:37 crc kubenswrapper[5008]: I1126 23:03:37.679934 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.575398 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.576495 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.656983 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.657061 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.657120 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.657145 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.657174 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.657261 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.657298 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.657346 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.657336 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.657742 5008 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.657767 5008 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.657778 5008 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.657789 5008 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.673000 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.687149 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.687194 5008 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="91f8c3ab3f4a5f7b54e77b3e6755816591a0a58f4d462a0cb257a42534c04220" exitCode=137 Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.687261 5008 scope.go:117] "RemoveContainer" containerID="91f8c3ab3f4a5f7b54e77b3e6755816591a0a58f4d462a0cb257a42534c04220" Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.687363 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.700178 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" event={"ID":"9b33e0c3-fdbc-41ff-8c6c-8581e4e641c5","Type":"ContainerStarted","Data":"8f71c145bc469c5ce1b38552de5e2ffd358f602f428298a8087f3bafa92c56a9"} Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.700499 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7d66f7697f-2vlzj" Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.758950 5008 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.769885 5008 scope.go:117] "RemoveContainer" containerID="91f8c3ab3f4a5f7b54e77b3e6755816591a0a58f4d462a0cb257a42534c04220" Nov 26 23:03:38 crc kubenswrapper[5008]: E1126 
23:03:38.771030 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91f8c3ab3f4a5f7b54e77b3e6755816591a0a58f4d462a0cb257a42534c04220\": container with ID starting with 91f8c3ab3f4a5f7b54e77b3e6755816591a0a58f4d462a0cb257a42534c04220 not found: ID does not exist" containerID="91f8c3ab3f4a5f7b54e77b3e6755816591a0a58f4d462a0cb257a42534c04220" Nov 26 23:03:38 crc kubenswrapper[5008]: I1126 23:03:38.771127 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91f8c3ab3f4a5f7b54e77b3e6755816591a0a58f4d462a0cb257a42534c04220"} err="failed to get container status \"91f8c3ab3f4a5f7b54e77b3e6755816591a0a58f4d462a0cb257a42534c04220\": rpc error: code = NotFound desc = could not find container \"91f8c3ab3f4a5f7b54e77b3e6755816591a0a58f4d462a0cb257a42534c04220\": container with ID starting with 91f8c3ab3f4a5f7b54e77b3e6755816591a0a58f4d462a0cb257a42534c04220 not found: ID does not exist" Nov 26 23:03:39 crc kubenswrapper[5008]: I1126 23:03:39.518283 5008 scope.go:117] "RemoveContainer" containerID="186bdeccc77766b9f6f4646d0f1ac8b578e5875f33c3484214a87f7e351f0a8c" Nov 26 23:03:39 crc kubenswrapper[5008]: I1126 23:03:39.535173 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Nov 26 23:03:39 crc kubenswrapper[5008]: I1126 23:03:39.535773 5008 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Nov 26 23:03:39 crc kubenswrapper[5008]: I1126 23:03:39.558191 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 26 23:03:39 crc kubenswrapper[5008]: I1126 23:03:39.558257 5008 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" 
mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="11a74552-14b1-42e4-8f19-66bf8e581055" Nov 26 23:03:39 crc kubenswrapper[5008]: I1126 23:03:39.569837 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 26 23:03:39 crc kubenswrapper[5008]: I1126 23:03:39.569904 5008 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="11a74552-14b1-42e4-8f19-66bf8e581055" Nov 26 23:03:40 crc kubenswrapper[5008]: I1126 23:03:40.726492 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" event={"ID":"9309fbab-00dc-4e76-a384-b9297f098fe9","Type":"ContainerStarted","Data":"780ca27a05281e70cf6e1dd4048770d009652e7a1eb1989856e4a583b66105d9"} Nov 26 23:03:40 crc kubenswrapper[5008]: I1126 23:03:40.727549 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" Nov 26 23:03:41 crc kubenswrapper[5008]: I1126 23:03:41.040471 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-86cc6c797c-xj5wk" Nov 26 23:03:41 crc kubenswrapper[5008]: I1126 23:03:41.629154 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-75b97bfb54-shcmm" Nov 26 23:03:44 crc kubenswrapper[5008]: I1126 23:03:44.712089 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-76d465bf76-r74xk" Nov 26 23:03:45 crc kubenswrapper[5008]: I1126 23:03:45.528602 5008 scope.go:117] "RemoveContainer" containerID="87df0209013418ff1f7b2ae433ff375a3ec956d39448e870540444f03dbbfaf0" Nov 26 23:03:45 crc kubenswrapper[5008]: I1126 
23:03:45.786085 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" event={"ID":"72dcab39-084a-40c0-8646-acf173ea065d","Type":"ContainerStarted","Data":"a6aa8b4c6d94aed1d309ba725e89a4ff01f465dad7ef6c676dc0bf01d518afcd"} Nov 26 23:03:45 crc kubenswrapper[5008]: I1126 23:03:45.786387 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7f6c555587-tvtn9" Nov 26 23:03:47 crc kubenswrapper[5008]: I1126 23:03:47.286530 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5476c5fbf7-zg2vt" Nov 26 23:03:50 crc kubenswrapper[5008]: I1126 23:03:50.076559 5008 scope.go:117] "RemoveContainer" containerID="4f44baf2490363918a126873442f5105a7450a2639ae2508996eeecd63a74d92"